diff --git a/docs/docs/components/custom.mdx b/docs/docs/components/custom.mdx
index 1880fcb0d..1c039bd2a 100644
--- a/docs/docs/components/custom.mdx
+++ b/docs/docs/components/custom.mdx
@@ -3,7 +3,8 @@ import Admonition from "@theme/Admonition";
# Custom Components
- Read the [Custom Component Guidelines](../administration/custom-component) for detailed information on custom components.
+ Read the [Custom Component Guidelines](../administration/custom-component) for
+ detailed information on custom components.
Custom components let you extend Langflow by creating reusable and configurable components from a Python script.
@@ -31,57 +32,60 @@ This class is the foundation for creating custom components. It allows users to
The following types are supported in the build method:
-| Supported Types |
-| --------------------------------------------------------- |
-| _`str`_, _`int`_, _`float`_, _`bool`_, _`list`_, _`dict`_ |
-| _`langflow.field_typing.NestedDict`_ |
-| _`langflow.field_typing.Prompt`_ |
-| _`langchain.chains.base.Chain`_ |
-| _`langchain.PromptTemplate`_ |
+| Supported Types |
+| ----------------------------------------------------------------- |
+| _`str`_, _`int`_, _`float`_, _`bool`_, _`list`_, _`dict`_ |
+| _`langflow.field_typing.NestedDict`_ |
+| _`langflow.field_typing.Prompt`_ |
+| _`langchain.chains.base.Chain`_ |
+| _`langchain.PromptTemplate`_ |
| _`from langchain.schema.language_model import BaseLanguageModel`_ |
-| _`langchain.Tool`_ |
-| _`langchain.document_loaders.base.BaseLoader`_ |
-| _`langchain.schema.Document`_ |
-| _`langchain.text_splitters.TextSplitter`_ |
-| _`langchain.vectorstores.base.VectorStore`_ |
-| _`langchain.embeddings.base.Embeddings`_ |
-| _`langchain.schema.BaseRetriever`_ |
+| _`langchain.Tool`_ |
+| _`langchain.document_loaders.base.BaseLoader`_ |
+| _`langchain.schema.Document`_ |
+| _`langchain.text_splitters.TextSplitter`_ |
+| _`langchain.vectorstores.base.VectorStore`_ |
+| _`langchain.embeddings.base.Embeddings`_ |
+| _`langchain.schema.BaseRetriever`_ |
The difference between _`dict`_ and _`langflow.field_typing.NestedDict`_ is that _`dict`_ adds a simple key-value pair field, while _`NestedDict`_ opens a more robust dictionary editor.
- Use the `Prompt` type by adding **kwargs to the build method.
- If you want to add the values of the variables to the template you defined, format the `PromptTemplate` inside the `CustomComponent` class.
+ Use the `Prompt` type by adding `**kwargs` to the build method. If you want to
+ add the values of the variables to the template you defined, format the
+ `PromptTemplate` inside the `CustomComponent` class.
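As an illustration, the sketch below shows how `**kwargs` can carry the prompt variables into the template. It is only a sketch: `PromptTemplate` is stubbed as a minimal stand-in for `langchain.PromptTemplate` so the snippet runs standalone, and the `topic` variable is a hypothetical example.

```python
# Sketch only: a minimal stand-in for langchain.PromptTemplate so this
# snippet runs without langchain installed.
class PromptTemplate:
    def __init__(self, template: str, input_variables: list):
        self.template = template
        self.input_variables = input_variables

    def format(self, **kwargs) -> str:
        return self.template.format(**kwargs)


def build(template: str = "Tell me about {topic}.", **kwargs) -> str:
    # **kwargs receives the Prompt variables; format them into the template.
    prompt = PromptTemplate(template=template, input_variables=list(kwargs))
    return prompt.format(**kwargs)
```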
- Use base Python types without a handle by default. To add handles, use the `input_types` key in the `build_config` method.
+ Use base Python types without a handle by default. To add handles, use the
+ `input_types` key in the `build_config` method.
**build_config:** Defines the configuration fields of the component. This method returns a dictionary where each key represents a field name and each value defines the field's behavior.
Supported keys for configuring fields:
-| Key | Description |
-| --------------------- | --------------------------------------------------- |
-| `is_list` | Boolean indicating if the field can hold multiple values. |
-| `options` | Dropdown menu options. |
-| `multiline` | Boolean indicating if a field allows multiline input. |
-| `input_types` | Allows connection handles for string fields. |
-| `display_name` | Field name displayed in the UI. |
-| `advanced` | Hides the field in the default UI view. |
-| `password` | Masks input, useful for sensitive data. |
-| `required` | Overrides the default behavior to make a field mandatory. |
-| `info` | Tooltip for the field. |
-| `file_types` | Accepted file types, useful for file fields. |
-| `range_spec` | Defines valid ranges for float fields. |
-| `title_case` | Boolean that controls field name capitalization. |
-| `refresh_button` | Adds a refresh button that updates field values. |
-| `real_time_refresh` | Updates the configuration as field values change. |
-| `field_type` | Automatically set based on the build method's type hint. |
+| Key | Description |
+| ------------------- | --------------------------------------------------------- |
+| `is_list` | Boolean indicating if the field can hold multiple values. |
+| `options` | Dropdown menu options. |
+| `multiline` | Boolean indicating if a field allows multiline input. |
+| `input_types` | Allows connection handles for string fields. |
+| `display_name` | Field name displayed in the UI. |
+| `advanced` | Hides the field in the default UI view. |
+| `password` | Masks input, useful for sensitive data. |
+| `required` | Overrides the default behavior to make a field mandatory. |
+| `info` | Tooltip for the field. |
+| `file_types` | Accepted file types, useful for file fields. |
+| `range_spec` | Defines valid ranges for float fields. |
+| `title_case` | Boolean that controls field name capitalization. |
+| `refresh_button` | Adds a refresh button that updates field values. |
+| `real_time_refresh` | Updates the configuration as field values change. |
+| `field_type` | Automatically set based on the build method's type hint. |
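Putting the pieces together, a minimal component might look like the sketch below. This is an illustration only, assuming Langflow's `CustomComponent` API: the base class is stubbed so the snippet runs standalone, and the component and field names are hypothetical. The configuration keys come from the table above.

```python
# Sketch only: CustomComponent is a stand-in for Langflow's base class
# so this snippet runs outside Langflow.
class CustomComponent:
    pass


class TextRepeater(CustomComponent):
    display_name = "Text Repeater"
    description = "Repeats the input text a configurable number of times."

    def build_config(self) -> dict:
        # Each key matches a build() parameter; the inner keys come from
        # the table of supported configuration keys.
        return {
            "text": {
                "display_name": "Text",
                "multiline": True,
                "info": "The text to repeat.",
            },
            "times": {
                "display_name": "Times",
                "info": "How many times to repeat the text.",
            },
        }

    def build(self, text: str, times: int) -> str:
        # The str and int type hints become a text field and a number field.
        return text * times
```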
- Use the `update_build_config` method to dynamically update configurations based on field values.
+ Use the `update_build_config` method to dynamically update configurations
+ based on field values.
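A minimal sketch of such an update is shown below. The `(build_config, field_value, field_name)` shape and the `mode` and `options` fields are assumptions for illustration, not part of a documented signature.

```python
# Sketch only: toggles a field's visibility when another field changes.
def update_build_config(build_config: dict, field_value, field_name: str) -> dict:
    if field_name == "mode":
        # Hide the hypothetical "options" dropdown unless mode is "custom".
        build_config["options"]["advanced"] = field_value != "custom"
    return build_config
```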
## Additional methods and attributes
@@ -99,4 +103,3 @@ The `CustomComponent` class also provides helpful methods for specific tasks (e.
- `status`: Shows values from the `build` method, useful for debugging.
- `field_order`: Controls the display order of fields.
- `icon`: Sets the canvas display icon.
-
diff --git a/docs/docs/examples/chat-memory.mdx b/docs/docs/examples/chat-memory.mdx
index d9b7d2e20..88dbbca2b 100644
--- a/docs/docs/examples/chat-memory.mdx
+++ b/docs/docs/examples/chat-memory.mdx
@@ -14,4 +14,4 @@ This component is available under the **Helpers** tab of the Langflow preview.
style={{ marginBottom: "20px", display: "flex", justifyContent: "center" }}
>
-
\ No newline at end of file
+
diff --git a/docs/docs/examples/combine-text.mdx b/docs/docs/examples/combine-text.mdx
index 0d7524a5b..5a4e86cf0 100644
--- a/docs/docs/examples/combine-text.mdx
+++ b/docs/docs/examples/combine-text.mdx
@@ -18,4 +18,4 @@ This component is available under the **Helpers** tab of the Langflow preview.
style={{ marginBottom: "20px", display: "flex", justifyContent: "center" }}
>
-
\ No newline at end of file
+
diff --git a/docs/docs/examples/create-record.mdx b/docs/docs/examples/create-record.mdx
index f94ba84bd..aa7a886f4 100644
--- a/docs/docs/examples/create-record.mdx
+++ b/docs/docs/examples/create-record.mdx
@@ -14,4 +14,4 @@ The **Create Record** component allows you to dynamically create a `Record` from
style={{ marginBottom: "20px", display: "flex", justifyContent: "center" }}
>
-
\ No newline at end of file
+
diff --git a/docs/docs/examples/pass.mdx b/docs/docs/examples/pass.mdx
index cdf1858d5..ddfe35cca 100644
--- a/docs/docs/examples/pass.mdx
+++ b/docs/docs/examples/pass.mdx
@@ -14,4 +14,4 @@ The **Pass** component enables you to ignore one input and move forward with ano
style={{ marginBottom: "20px", display: "flex", justifyContent: "center" }}
>
-
\ No newline at end of file
+
diff --git a/docs/docs/examples/store-message.mdx b/docs/docs/examples/store-message.mdx
index 610bf645c..75ff0bd46 100644
--- a/docs/docs/examples/store-message.mdx
+++ b/docs/docs/examples/store-message.mdx
@@ -14,4 +14,4 @@ The **Message History** component can then be used to retrieve stored messages.
style={{ marginBottom: "20px", display: "flex", justifyContent: "center" }}
>
-
\ No newline at end of file
+
diff --git a/docs/docs/examples/sub-flow.mdx b/docs/docs/examples/sub-flow.mdx
index ae7e5c9da..d2b9674ad 100644
--- a/docs/docs/examples/sub-flow.mdx
+++ b/docs/docs/examples/sub-flow.mdx
@@ -12,4 +12,4 @@ The **Sub Flow** component enables a user to select a previously built flow and
style={{ marginBottom: "20px", display: "flex", justifyContent: "center" }}
>
-
\ No newline at end of file
+
diff --git a/docs/docs/examples/text-operator.mdx b/docs/docs/examples/text-operator.mdx
index 5637dbc79..50d52fdbf 100644
--- a/docs/docs/examples/text-operator.mdx
+++ b/docs/docs/examples/text-operator.mdx
@@ -12,4 +12,4 @@ The **Text Operator** component simplifies logic. It evaluates the results from
style={{ marginBottom: "20px", display: "flex", justifyContent: "center" }}
>
-
\ No newline at end of file
+
diff --git a/docs/docs/getting-started/canvas.mdx b/docs/docs/getting-started/canvas.mdx
index b16807b66..5974f245b 100644
--- a/docs/docs/getting-started/canvas.mdx
+++ b/docs/docs/getting-started/canvas.mdx
@@ -56,7 +56,8 @@ Components are the building blocks of flows. They consist of inputs, outputs, an
During the flow creation process, you will notice handles (colored circles)
attached to one or both sides of a component. These handles represent the
- availability to connect to other components. Hover over a handle to see connection details.
+ availability to connect to other components. Hover over a handle to see
+ connection details.
@@ -85,6 +86,7 @@ Build the flow by clicking the **Playgr
Once the validation is complete, the status of each validated component should turn green ().
To debug, hover over the component status to see the outputs.
+
---
@@ -196,6 +198,7 @@ curl -X POST \
```
Result:
+
```
{"session_id":"f2eefd80-bb91-4190-9279-0d6ffafeaac4:53856a772b8e1cfcb3dd2e71576b5215399e95bae318d3c02101c81b7c252da3","outputs":[{"inputs":{"input_value":"is anybody there?"},"outputs":[{"results":{"result":"Arrr, me hearties! Aye, this be Captain [Your Name] speakin'. What be ye needin', matey?"},"artifacts":{"message":"Arrr, me hearties! Aye, this be Captain [Your Name] speakin'. What be ye needin', matey?","sender":"Machine","sender_name":"AI"},"messages":[{"message":"Arrr, me hearties! Aye, this be Captain [Your Name] speakin'. What be ye needin', matey?","sender":"Machine","sender_name":"AI","component_id":"ChatOutput-njtka"}],"component_display_name":"Chat Output","component_id":"ChatOutput-njtka"}]}]}%
```
@@ -231,9 +234,10 @@ A collection is a snapshot of flows available in a database.
Collections can be downloaded to local storage and uploaded for future use.
-
-
+
+
## Project
@@ -276,9 +280,3 @@ To see options for your project, in the upper left corner of the canvas, select
**Export** - Download your current project to your local machine as a `.json` file.
**Undo** or **Redo** - Undo or redo your last action.
-
-
-
-
-
-
diff --git a/docs/docs/getting-started/flows-components-collections.mdx b/docs/docs/getting-started/flows-components-collections.mdx
index 586f08192..335fb5c12 100644
--- a/docs/docs/getting-started/flows-components-collections.mdx
+++ b/docs/docs/getting-started/flows-components-collections.mdx
@@ -1,7 +1,7 @@
-import ThemedImage from '@theme/ThemedImage';
-import useBaseUrl from '@docusaurus/useBaseUrl';
-import ZoomableImage from '/src/theme/ZoomableImage.js';
-import ReactPlayer from 'react-player';
+import ThemedImage from "@theme/ThemedImage";
+import useBaseUrl from "@docusaurus/useBaseUrl";
+import ZoomableImage from "/src/theme/ZoomableImage.js";
+import ReactPlayer from "react-player";
# 🖥️ Flows, components, collections, and projects
@@ -17,10 +17,4 @@ A [project](#project) can be a component or a flow. Projects are saved as part o
For example, the **OpenAI LLM** is a **component** of the **Basic prompting** flow, and the **flow** is stored in a **collection**.
-
-
## Component
-
-
-
-
diff --git a/docs/docs/getting-started/install-langflow.mdx b/docs/docs/getting-started/install-langflow.mdx
index 836645a1e..4beb5e362 100644
--- a/docs/docs/getting-started/install-langflow.mdx
+++ b/docs/docs/getting-started/install-langflow.mdx
@@ -6,33 +6,40 @@ import Admonition from "@theme/Admonition";
# 📦 Install Langflow
- Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space using this link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true), to create your own Langflow workspace in minutes.
+ Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space
+ using this
+ link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true)
+ to create your own Langflow workspace in minutes.
Langflow requires [Python >=3.10](https://www.python.org/downloads/release/python-3100/) and [pip](https://pypi.org/project/pip/) or [pipx](https://pipx.pypa.io/stable/installation/) to be installed on your system.
Install Langflow with pip:
+
```bash
python -m pip install langflow -U
```
Install Langflow with pipx:
+
```bash
pipx install langflow --python python3.10 --fetch-missing-python
```
-Pipx can fetch the missing Python version for you with `--fetch-missing-python`, but you can also install the Python version manually.
+Pipx can fetch the missing Python version for you with `--fetch-missing-python`, but you can also install the Python version manually.
## Install Langflow pre-release
To install a pre-release version of Langflow:
pip:
+
```bash
python -m pip install langflow --pre --force-reinstall
```
pipx:
+
```bash
pipx install langflow --python python3.10 --fetch-missing-python --pip-args="--pre --force-reinstall"
```
@@ -52,11 +59,13 @@ python -m langflow --help
## ⛓️ Run Langflow
1. To run Langflow, enter the following command.
+
```bash
python -m langflow run
```
2. Confirm that a local Langflow instance starts by visiting `http://127.0.0.1:7860` in a Chromium-based browser.
+
```bash
│ Welcome to ⛓ Langflow │
│ │
@@ -83,4 +92,4 @@ You'll be presented with the following screen:
style={{ width: "100%", margin: "20px auto" }}
/>
-Name your Space, define the visibility (Public or Private), and click on **Duplicate Space** to start the installation process. When installation is finished, you'll be redirected to the Space's main page to start using Langflow right away!
\ No newline at end of file
+Name your Space, define the visibility (Public or Private), and click **Duplicate Space** to start the installation process. When installation is finished, you'll be redirected to the Space's main page to start using Langflow right away!
diff --git a/docs/docs/getting-started/quickstart.mdx b/docs/docs/getting-started/quickstart.mdx
index ef7d373a6..3f02db27f 100644
--- a/docs/docs/getting-started/quickstart.mdx
+++ b/docs/docs/getting-started/quickstart.mdx
@@ -10,12 +10,15 @@ This guide demonstrates how to build a basic prompt flow and modify that prompt
## Prerequisites
-* [Langflow installed and running](./install-langflow.mdx)
+- [Langflow installed and running](./install-langflow.mdx)
-* [OpenAI API key](https://platform.openai.com)
+- [OpenAI API key](https://platform.openai.com)
- Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space using this link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true) to create your own Langflow workspace in minutes.
+ Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space
+ using this
+ link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true)
+ to create your own Langflow workspace in minutes.
## Hello World - Basic Prompting
@@ -44,25 +47,25 @@ Examine the **Prompt** component. The **Template** field instructs the LLM to `A
This should be interesting...
4. To create an environment variable for the **OpenAI** component, in the **OpenAI API Key** field, click the **Globe** button, and then click **Add New Variable**.
- 1. In the **Variable Name** field, enter `openai_api_key`.
- 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
- 3. Click **Save Variable**.
+ 1. In the **Variable Name** field, enter `openai_api_key`.
+ 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
+ 3. Click **Save Variable**.
## Run the basic prompting flow
1. Click the **Run** button.
-The **Interaction Panel** opens, where you can chat with your bot.
+ The **Interaction Panel** opens, where you can chat with your bot.
2. Type a message and press Enter.
-And... Ahoy! 🏴☠️
-The bot responds in a piratical manner!
+ And... Ahoy! 🏴☠️
+ The bot responds in a piratical manner!
## Modify the prompt for a different result
1. To modify your prompt results, in the **Prompt** template, click the **Template** field.
-The **Edit Prompt** window opens.
+ The **Edit Prompt** window opens.
2. Change `Answer the user as if you were a pirate` to a different character, perhaps `Answer the user as if you were Harold Abelson.`
3. Run the basic prompting flow again.
-The response will be markedly different.
+ The response will be markedly different.
## Next steps
@@ -72,8 +75,6 @@ By adding Langflow components to your flow, you can create all sorts of interest
Here are a couple of examples:
-* [Memory chatbot](/starter-projects/memory-chatbot.mdx)
-* [Blog writer](/starter-projects/blog-writer.mdx)
-* [Document QA](/starter-projects/document-qa.mdx)
-
-
+- [Memory chatbot](/starter-projects/memory-chatbot.mdx)
+- [Blog writer](/starter-projects/blog-writer.mdx)
+- [Document QA](/starter-projects/document-qa.mdx)
diff --git a/docs/docs/index.mdx b/docs/docs/index.mdx
index 54960502b..e762142f0 100644
--- a/docs/docs/index.mdx
+++ b/docs/docs/index.mdx
@@ -29,7 +29,10 @@ Its intuitive interface allows for easy manipulation of AI building blocks, enab
- [Langflow Canvas](/getting-started/canvas) - Learn more about the Langflow canvas.
- Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space using this link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true) to create your own Langflow workspace in minutes.
+ Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space
+ using this
+ link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true)
+ to create your own Langflow workspace in minutes.
## Learn more about Langflow 1.0
diff --git a/docs/docs/integrations/notion/add-content-to-page.md b/docs/docs/integrations/notion/add-content-to-page.md
index 83b395fd0..243c09d81 100644
--- a/docs/docs/integrations/notion/add-content-to-page.md
+++ b/docs/docs/integrations/notion/add-content-to-page.md
@@ -16,7 +16,7 @@ The `AddContentToPage` component enables you to:
- Convert markdown text to Notion blocks.
- Append the converted blocks to a specified Notion page.
- Seamlessly integrate Notion content creation into Langflow workflows.
-
+
## Component Usage
@@ -105,12 +105,12 @@ class NotionPageCreator(CustomComponent):
Example of using the `AddContentToPage` component in a Langflow flow using Markdown as input:
In this example, the `AddContentToPage` component connects to a `MarkdownLoader` component to provide the markdown text input. The converted Notion blocks are appended to the specified Notion page using the provided `block_id` and `notion_secret`.
@@ -131,8 +131,8 @@ The `AddContentToPage` component is a powerful tool for integrating Notion conte
## Troubleshooting
If you encounter any issues while using the `AddContentToPage` component, consider the following:
+
- Verify the Notion integration token’s validity and permissions.
- Check the Notion API documentation for updates.
- Ensure markdown text is properly formatted.
- Double-check the `block_id` for correctness.
-
diff --git a/docs/docs/integrations/notion/intro.md b/docs/docs/integrations/notion/intro.md
index 11b8b7823..293038d4f 100644
--- a/docs/docs/integrations/notion/intro.md
+++ b/docs/docs/integrations/notion/intro.md
@@ -8,12 +8,12 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
The Notion integration in Langflow enables seamless connectivity with Notion databases, pages, and users, facilitating automation and improving productivity.
####
Download Notion Components Bundle
diff --git a/docs/docs/integrations/notion/list-database-properties.md b/docs/docs/integrations/notion/list-database-properties.md
index 830ea3324..c41159893 100644
--- a/docs/docs/integrations/notion/list-database-properties.md
+++ b/docs/docs/integrations/notion/list-database-properties.md
@@ -41,7 +41,7 @@ class NotionDatabaseProperties(CustomComponent):
description = "Retrieve properties of a Notion database."
documentation: str = "https://docs.langflow.org/integrations/notion/list-database-properties"
icon = "NotionDirectoryLoader"
-
+
def build_config(self):
return {
"database_id": {
@@ -80,6 +80,7 @@ class NotionDatabaseProperties(CustomComponent):
```
## Example Usage
+
Here's an example of how you can use the `NotionDatabaseProperties` component in a Langflow flow:
@@ -110,6 +111,7 @@ Feel free to explore the capabilities of the `NotionDatabaseProperties` componen
## Troubleshooting
If you encounter any issues while using the `NotionDatabaseProperties` component, consider the following:
+
- Verify that the Notion integration token is valid and has the required permissions.
- Check the database ID to ensure it matches the intended Notion database.
-- Inspect the response from the Notion API for any error messages or status codes that may indicate the cause of the issue.
\ No newline at end of file
+- Inspect the response from the Notion API for any error messages or status codes that may indicate the cause of the issue.
diff --git a/docs/docs/integrations/notion/list-pages.md b/docs/docs/integrations/notion/list-pages.md
index 3e219870e..ea1b04950 100644
--- a/docs/docs/integrations/notion/list-pages.md
+++ b/docs/docs/integrations/notion/list-pages.md
@@ -140,16 +140,17 @@ class NotionListPages(CustomComponent):
## Example Usage
+
Here's an example of how you can use the `NotionListPages` component in a Langflow flow and pass its output to the Prompt component:
In this example, the `NotionListPages` component is used to retrieve specific pages from a Notion database based on the provided filters and sorting options. The retrieved data can then be processed further in the subsequent components of the flow.
@@ -157,7 +158,7 @@ In this example, the `NotionListPages` component is used to retrieve specific pa
## Best Practices
- When using the `NotionListPages
-` component, consider the following best practices:
+When using the `NotionListPages` component, consider the following best practices:
- Ensure that you have a valid Notion integration token with the necessary permissions to query the desired database.
@@ -171,7 +172,7 @@ We encourage you to explore the capabilities of the `NotionListPages
## Troubleshooting
- If you encounter any issues while using the `NotionListPages` component, consider the following:
+If you encounter any issues while using the `NotionListPages` component, consider the following:
- Double-check that the `notion_secret` and `database_id` are correct and valid.
- Verify that the `query_payload` JSON string is properly formatted and contains valid filtering and sorting options.
diff --git a/docs/docs/integrations/notion/list-users.md b/docs/docs/integrations/notion/list-users.md
index 90761239a..c22c20ca8 100644
--- a/docs/docs/integrations/notion/list-users.md
+++ b/docs/docs/integrations/notion/list-users.md
@@ -15,7 +15,7 @@ The `NotionUserList` component retrieves users from Notion. It provides a conven
- Retrieve user data from Notion
- Access user information such as ID, type, name, and avatar URL
- Integrate Notion user data seamlessly into your Langflow workflows
-
+
## Component Usage
@@ -94,34 +94,34 @@ class NotionUserList(CustomComponent):
```
## Example Usage
+
Here's an example of how you can use the `NotionUserList` component in a Langflow flow and pass the outputs to the Prompt component:
## Best Practices
- When using the `NotionUserList` component, consider the following best practices:
+When using the `NotionUserList` component, consider the following best practices:
- Ensure that you have a valid Notion integration token with the necessary permissions to retrieve user data.
- Handle the retrieved user data securely and in compliance with Notion's API usage guidelines.
The `NotionUserList` component provides a seamless way to integrate Notion user data into your Langflow workflows. By leveraging this component, you can easily retrieve and utilize user information from Notion, enhancing the capabilities of your Langflow applications. Feel free to explore and experiment with the `NotionUserList` component to unlock new possibilities in your Langflow projects!
-
## Troubleshooting
- If you encounter any issues while using the `NotionUserList` component, consider the following:
+If you encounter any issues while using the `NotionUserList` component, consider the following:
- Double-check that your Notion integration token is valid and has the required permissions.
- Verify that you have installed the necessary dependencies (`requests`) for the component to function properly.
-- Check the Notion API documentation for any updates or changes that may affect the component's functionality.
\ No newline at end of file
+- Check the Notion API documentation for any updates or changes that may affect the component's functionality.
diff --git a/docs/docs/integrations/notion/page-content-viewer.md b/docs/docs/integrations/notion/page-content-viewer.md
index a38c05fd0..f4eeba052 100644
--- a/docs/docs/integrations/notion/page-content-viewer.md
+++ b/docs/docs/integrations/notion/page-content-viewer.md
@@ -11,7 +11,7 @@ The `NotionPageContent` component retrieves the content of a Notion page as plai
- The `NotionPageContent` component enables you to:
+The `NotionPageContent` component enables you to:
- Retrieve the content of a Notion page as plain text
- Extract text from various block types, including paragraphs, headings, lists, and more
@@ -114,18 +114,18 @@ class NotionPageContent(CustomComponent):
Here's an example of how you can use the `NotionPageContent` component in a Langflow flow:
## Best Practices
- When using the `NotionPageContent` component, consider the following best practices:
+When using the `NotionPageContent` component, consider the following best practices:
- Ensure that you have the necessary permissions to access the Notion page you want to retrieve.
- Keep your Notion integration token secure and avoid sharing it publicly.
@@ -135,7 +135,7 @@ The `NotionPageContent` component provides a seamless way to integrate Notion pa
## Troubleshooting
- If you encounter any issues while using the `NotionPageContent` component, consider the following:
+If you encounter any issues while using the `NotionPageContent` component, consider the following:
- Double-check that you have provided the correct Notion page ID.
- Verify that your Notion integration token is valid and has the necessary permissions.
diff --git a/docs/docs/integrations/notion/page-create.md b/docs/docs/integrations/notion/page-create.md
index 0269096b9..f942f257b 100644
--- a/docs/docs/integrations/notion/page-create.md
+++ b/docs/docs/integrations/notion/page-create.md
@@ -97,16 +97,17 @@ class NotionPageCreator(CustomComponent):
```
## Example Usage
+
Here's an example of how to use the `NotionPageCreator` component in a Langflow flow:
@@ -124,6 +125,7 @@ The `NotionPageCreator` component simplifies the process of creating pages in a
## Troubleshooting
If you encounter any issues while using the `NotionPageCreator` component, consider the following:
+
- Double-check that the `database_id` and `notion_secret` inputs are correct and valid.
- Verify that the `properties` input is properly formatted as a JSON string and matches the structure of your Notion database.
-- Check the Notion API documentation for any updates or changes that may affect the component's functionality.
\ No newline at end of file
+- Check the Notion API documentation for any updates or changes that may affect the component's functionality.
diff --git a/docs/docs/integrations/notion/page-update.md b/docs/docs/integrations/notion/page-update.md
index 3389f64d3..0370a2b3a 100644
--- a/docs/docs/integrations/notion/page-update.md
+++ b/docs/docs/integrations/notion/page-update.md
@@ -109,12 +109,12 @@ Let's break down the key parts of this component:
Here's an example of how to use the `NotionPageUpdate` component in a Langflow flow:
@@ -128,7 +128,6 @@ When using the `NotionPageUpdate` component, consider the following best practic
By leveraging the `NotionPageUpdate` component in Langflow, you can easily integrate updating Notion page properties into your language model workflows and build powerful applications that extend Langflow's capabilities.
-
## Troubleshooting
If you encounter any issues while using the `NotionPageUpdate` component, consider the following:
diff --git a/docs/docs/integrations/notion/search.md b/docs/docs/integrations/notion/search.md
index 3ff7472dc..a972bffc0 100644
--- a/docs/docs/integrations/notion/search.md
+++ b/docs/docs/integrations/notion/search.md
@@ -146,16 +146,17 @@ class NotionSearch(CustomComponent):
```
## Example Usage
+
Here's an example of how you can use the `NotionSearch` component in a Langflow flow:
In this example, the `NotionSearch` component is used to search for pages and databases in Notion based on the provided query and filter criteria. The retrieved data can then be processed further in the subsequent components of the flow.
diff --git a/docs/docs/integrations/notion/setup.md b/docs/docs/integrations/notion/setup.md
index 9511d9c81..72bb8f3b4 100644
--- a/docs/docs/integrations/notion/setup.md
+++ b/docs/docs/integrations/notion/setup.md
@@ -76,4 +76,3 @@ Refer to the individual component documentation for more details on how to use e
- [Notion Integration Capabilities](https://developers.notion.com/reference/capabilities)
If you encounter any issues or have questions, please reach out to our support team or consult the Langflow community forums.
-
diff --git a/docs/docs/migration/possible-installation-issues.mdx b/docs/docs/migration/possible-installation-issues.mdx
index 2590d0b8a..a012a1c09 100644
--- a/docs/docs/migration/possible-installation-issues.mdx
+++ b/docs/docs/migration/possible-installation-issues.mdx
@@ -25,11 +25,11 @@ ModuleNotFoundError: No module named 'langflow.__main__'
There are two possible reasons for this error:
1. You've installed Langflow using _`pip install langflow`_ but you already had a previous version of Langflow installed in your system.
- In this case, you might be running the wrong executable.
- To solve this issue, run the correct executable by running _`python -m langflow run`_ instead of _`langflow run`_.
- If that doesn't work, try uninstalling and reinstalling Langflow with _`python -m pip install langflow --pre -U`_.
+ In this case, you might be running the wrong executable.
+ To solve this issue, run the correct executable by running _`python -m langflow run`_ instead of _`langflow run`_.
+ If that doesn't work, try uninstalling and reinstalling Langflow with _`python -m pip install langflow --pre -U`_.
2. Some version conflicts might have occurred during the installation process.
- Run _`python -m pip install langflow --pre -U --force-reinstall`_ to reinstall Langflow and its dependencies.
+ Run _`python -m pip install langflow --pre -U --force-reinstall`_ to reinstall Langflow and its dependencies.
## _`Something went wrong running migrations. Please, run 'langflow migration --fix'`_
@@ -45,4 +45,3 @@ There are two possible reasons for this error:
This error can occur during Langflow upgrades when the new version can't overwrite `langflow-pre.db` in `.cache/langflow/`. Clearing the cache removes this file but will also erase your settings.
If you wish to retain your files, back them up before clearing the folder.
-
diff --git a/docs/docs/starter-projects/basic-prompting.mdx b/docs/docs/starter-projects/basic-prompting.mdx
index 6fb7391e2..26b054bcc 100644
--- a/docs/docs/starter-projects/basic-prompting.mdx
+++ b/docs/docs/starter-projects/basic-prompting.mdx
@@ -14,12 +14,15 @@ This article demonstrates how to use Langflow's prompt tools to issue basic prom
## Prerequisites
-* [Langflow installed and running](../getting-started/install-langflow.mdx)
+- [Langflow installed and running](../getting-started/install-langflow.mdx)
-* [OpenAI API key created](https://platform.openai.com)
+- [OpenAI API key created](https://platform.openai.com)
- Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space using this link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true) to create your own Langflow workspace in minutes.
+ Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space
+ using this
+ link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true)
+ to create your own Langflow workspace in minutes.
## Create the basic prompting project
@@ -42,25 +45,21 @@ Examine the **Prompt** component. The **Template** field instructs the LLM to `A
This should be interesting...
4. To create an environment variable for the **OpenAI** component, in the **OpenAI API Key** field, click the **Globe** button, and then click **Add New Variable**.
- 1. In the **Variable Name** field, enter `openai_api_key`.
- 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
- 3. Click **Save Variable**.
+ 1. In the **Variable Name** field, enter `openai_api_key`.
+ 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
+ 3. Click **Save Variable**.
## Run the basic prompting flow
1. Click the **Run** button.
-The **Interaction Panel** opens, where you can converse with your bot.
+ The **Interaction Panel** opens, where you can converse with your bot.
2. Type a message and press Enter.
-The bot responds in a markedly piratical manner!
+ The bot responds in a markedly piratical manner!
## Modify the prompt for a different result
1. To modify your prompt results, in the **Prompt** template, click the **Template** field.
-The **Edit Prompt** window opens.
+ The **Edit Prompt** window opens.
2. Change `Answer the user as if you were a pirate` to a different character, perhaps `Answer the user as if you were Harold Abelson.`
3. Run the basic prompting flow again.
-The response will be markedly different.
-
-
-
-
+ The response will be markedly different.
diff --git a/docs/docs/starter-projects/blog-writer.mdx b/docs/docs/starter-projects/blog-writer.mdx
index 0e8047fd6..9380bf114 100644
--- a/docs/docs/starter-projects/blog-writer.mdx
+++ b/docs/docs/starter-projects/blog-writer.mdx
@@ -10,12 +10,15 @@ Build a blog writer with OpenAI that uses URLs for reference content.
## Prerequisites
-* [Langflow installed and running](../getting-started/install-langflow.mdx)
+- [Langflow installed and running](../getting-started/install-langflow.mdx)
-* [OpenAI API key created](https://platform.openai.com)
+- [OpenAI API key created](https://platform.openai.com)
- Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space using this link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true) to create your own Langflow workspace in minutes.
+ Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space
+ using this
+ link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true)
+ to create your own Langflow workspace in minutes.
## Create the Blog Writer project
@@ -36,6 +39,7 @@ Build a blog writer with OpenAI that uses URLs for reference content.
This flow creates a one-shot prompt flow with **Prompt**, **OpenAI**, and **Chat Output** components, and augments the flow with reference content and instructions from the **URL** and **Instructions** components.
The **Prompt** component's default **Template** field looks like this:
+
```bash
Reference 1:
@@ -59,16 +63,16 @@ The `{instructions}` value is received from the **Value** field of the **Instruc
The `reference_1` and `reference_2` values are received from the **URL** fields of the **URL** components.
4. To create an environment variable for the **OpenAI** component, in the **OpenAI API Key** field, click the **Globe** button, and then click **Add New Variable**.
- 1. In the **Variable Name** field, enter `openai_api_key`.
- 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
- 3. Click **Save Variable**.
+ 1. In the **Variable Name** field, enter `openai_api_key`.
+ 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
+ 3. Click **Save Variable**.
## Run the Blog Writer flow
1. Click the **Run** button.
-The **Interaction Panel** opens, where you can run your one-shot flow.
+ The **Interaction Panel** opens, where you can run your one-shot flow.
2. Click the **Lightning Bolt** icon to run your flow.
3. The **OpenAI** component constructs a blog post with the **URL** items as context.
-The default **URL** values are for web pages at `promptingguide.ai`, so your blog post will be about prompting LLMs.
+ The default **URL** values are for web pages at `promptingguide.ai`, so your blog post will be about prompting LLMs.
-To write about something different, change the values in the **URL** components, and see what the LLM constructs.
\ No newline at end of file
+To write about something different, change the values in the **URL** components, and see what the LLM constructs.
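The `{instructions}`, `{reference_1}`, and `{reference_2}` substitution described above can be sketched in plain Python. This is an illustration only, not the **Prompt** component's actual implementation, and the content strings are made up:

```python
# Hypothetical stand-in for how the Prompt component fills its template.
# The placeholder names mirror the docs; the values are invented examples.
template = (
    "Reference 1:\n{reference_1}\n\n"
    "Reference 2:\n{reference_2}\n\n"
    "Instructions: {instructions}"
)

prompt = template.format(
    reference_1="Text fetched by the first URL component.",
    reference_2="Text fetched by the second URL component.",
    instructions="Write a short blog post about prompting LLMs.",
)
print(prompt)
```

In the flow, text assembled this way is what the **OpenAI** component receives as its prompt.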
diff --git a/docs/docs/starter-projects/document-qa.mdx b/docs/docs/starter-projects/document-qa.mdx
index 5e5377355..ddbcd901a 100644
--- a/docs/docs/starter-projects/document-qa.mdx
+++ b/docs/docs/starter-projects/document-qa.mdx
@@ -10,12 +10,15 @@ Build a question-and-answer chatbot with a document loaded from local memory.
## Prerequisites
-* [Langflow installed and running](../getting-started/install-langflow.mdx)
+- [Langflow installed and running](../getting-started/install-langflow.mdx)
-* [OpenAI API key created](https://platform.openai.com)
+- [OpenAI API key created](https://platform.openai.com)
- Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space using this link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true) to create your own Langflow workspace in minutes.
+ Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space
+ using this
+ link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true)
+ to create your own Langflow workspace in minutes.
## Create the Document QA project
@@ -39,24 +42,27 @@ The **Prompt** component is instructed to answer questions based on the contents
Including a file with the prompt gives the **OpenAI** component context it may not otherwise have access to.
4. To create an environment variable for the **OpenAI** component, in the **OpenAI API Key** field, click the **Globe** button, and then click **Add New Variable**.
- 1. In the **Variable Name** field, enter `openai_api_key`.
- 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
- 3. Click **Save Variable**.
+
+ 1. In the **Variable Name** field, enter `openai_api_key`.
+ 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
+ 3. Click **Save Variable**.
5. To select a document to load, in the **Files** component, click within the **Path** field.
- 1. Select a local file, and then click **Open**.
- 2. The file name appears in the field.
-
- The file must be of an extension type listed [here](https://github.com/langflow-ai/langflow/blob/dev/src/backend/base/langflow/base/data/utils.py#L13).
-
+ 1. Select a local file, and then click **Open**.
+ 2. The file name appears in the field.
+
+ The file must be of an extension type listed
+ [here](https://github.com/langflow-ai/langflow/blob/dev/src/backend/base/langflow/base/data/utils.py#L13).
+
## Run the Document QA flow
1. Click the **Run** button.
-The **Interaction Panel** opens, where you can converse with your bot.
+ The **Interaction Panel** opens, where you can converse with your bot.
2. Type a message and press Enter.
-For this example, we loaded an error log `.txt` file and asked, "What went wrong?"
-The bot responded:
+ For this example, we loaded an error log `.txt` file and asked, "What went wrong?"
+ The bot responded:
+
```
The issue occurred during the execution of migrations in the application. Specifically, an error was raised by the Alembic library, indicating that new upgrade operations were detected that had not been accounted for in the existing migration scripts. The operation in question involved modifying the nullable property of a column (apikey, created_at) in the database, with details about the existing type (DATETIME()), existing server default, and other properties.
```
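The **Files** component only accepts the extensions in the linked list. A guard like the following illustrates the idea; the extension set here is a made-up subset for demonstration, not the authoritative list from `langflow/base/data/utils.py`:

```python
from pathlib import Path

# Illustrative subset only; the authoritative list lives in
# langflow/base/data/utils.py in the Langflow repository.
SUPPORTED_SUFFIXES = {".txt", ".md", ".pdf", ".json"}

def is_supported(path: str) -> bool:
    # Compare the file's final suffix against the allowed set.
    return Path(path).suffix.lower() in SUPPORTED_SUFFIXES

print(is_supported("error-log.txt"))  # an error log like the one above
print(is_supported("archive.zip"))
```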
diff --git a/docs/docs/starter-projects/memory-chatbot.mdx b/docs/docs/starter-projects/memory-chatbot.mdx
index 86c64d368..8e38ca3e0 100644
--- a/docs/docs/starter-projects/memory-chatbot.mdx
+++ b/docs/docs/starter-projects/memory-chatbot.mdx
@@ -10,12 +10,15 @@ This flow extends the [basic prompting flow](./basic-prompting.mdx) to include c
## Prerequisites
-* [Langflow installed and running](../getting-started/install-langflow.mdx)
+- [Langflow installed and running](../getting-started/install-langflow.mdx)
-* [OpenAI API key created](https://platform.openai.com)
+- [OpenAI API key created](https://platform.openai.com)
- Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space using this link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true) to create your own Langflow workspace in minutes.
+ Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space
+ using this
+ link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true)
+ to create your own Langflow workspace in minutes.
## Create the memory chatbot project
@@ -43,16 +46,16 @@ This chatbot is augmented with the **Chat Memory** component, which stores messa
The **Chat History** component gives the **OpenAI** component a memory of previous questions.
4. To create an environment variable for the **OpenAI** component, in the **OpenAI API Key** field, click the **Globe** button, and then click **Add New Variable**.
- 1. In the **Variable Name** field, enter `openai_api_key`.
- 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
- 3. Click **Save Variable**.
+ 1. In the **Variable Name** field, enter `openai_api_key`.
+ 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
+ 3. Click **Save Variable**.
## Run the memory chatbot flow
1. Click the **Run** button.
-The **Interaction Panel** opens, where you can converse with your bot.
+ The **Interaction Panel** opens, where you can converse with your bot.
2. Type a message and press Enter.
-The bot will respond according to the template in the **Prompt** component.
+ The bot will respond according to the template in the **Prompt** component.
3. Type more questions. In the **Outputs** log, your queries are logged in order. Up to 5 queries are stored by default. Try asking `What is the first subject I asked you about?` to see when the LLM's memory runs out.
## Modify the Session ID field to have multiple conversations
@@ -65,11 +68,11 @@ You can demonstrate this by modifying the **Session ID** value to switch between
1. In the **Session ID** field of the **Chat Memory** and **Chat Input** components, change the **Session ID** value from `MySessionID` to `AnotherSessionID`.
2. Click the **Run** button to run your flow.
-In the **Interaction Panel**, you will have a new conversation. (You may need to clear the cache with the **Eraser** button).
+ In the **Interaction Panel**, you will have a new conversation. (You may need to clear the cache with the **Eraser** button).
3. Type a few questions to your bot.
4. In the **Session ID** field of the **Chat Memory** and **Chat Input** components, change the **Session ID** value back to `MySessionID`.
5. Run your flow.
-The **Outputs** log of the **Interaction Panel** displays the history from your initial chat with `MySessionID`.
+ The **Outputs** log of the **Interaction Panel** displays the history from your initial chat with `MySessionID`.
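The session-scoped behavior above can be approximated with a plain dictionary keyed by session ID. This is an illustrative sketch, not Langflow's actual memory store; the 5-message bound mirrors the default mentioned earlier:

```python
from collections import defaultdict, deque

# Hypothetical in-memory store: one bounded history per session ID.
histories = defaultdict(lambda: deque(maxlen=5))

def remember(session_id: str, message: str) -> list:
    # Append to this session's history and return a snapshot of it.
    histories[session_id].append(message)
    return list(histories[session_id])

remember("MySessionID", "What is a line?")
remember("AnotherSessionID", "Tell me a joke.")
# Switching back to MySessionID recovers only its own history.
print(remember("MySessionID", "What was my first question?"))
```

Changing the **Session ID** value is analogous to switching which key the history is read from and written to.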
## Store Session ID as a Langflow variable
@@ -79,4 +82,3 @@ To store **Session ID** as a Langflow variable, in the **Session ID** field, cli
2. In the **Value** field, enter a value like `1B5EBD79-6E9C-4533-B2C8-7E4FF29E983B`.
3. Click **Save Variable**.
4. Apply this variable to **Chat Input**.
-
diff --git a/docs/docs/starter-projects/vector-store-rag.mdx b/docs/docs/starter-projects/vector-store-rag.mdx
index ddb0a1d46..d0054e6c4 100644
--- a/docs/docs/starter-projects/vector-store-rag.mdx
+++ b/docs/docs/starter-projects/vector-store-rag.mdx
@@ -17,16 +17,19 @@ We've chosen [Astra DB](https://astra.datastax.com/signup?utm_source=langflow-pr
## Prerequisites
- Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space using this link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true) to create your own Langflow workspace in minutes.
+ Langflow v1.0 alpha is also available in HuggingFace Spaces. [Clone the space
+ using this
+ link](https://huggingface.co/spaces/Langflow/Langflow-Preview?duplicate=true)
+ to create your own Langflow workspace in minutes.
-* [Langflow installed and running](../getting-started/install-langflow.mdx)
+- [Langflow installed and running](../getting-started/install-langflow.mdx)
-* [OpenAI API key](https://platform.openai.com)
+- [OpenAI API key](https://platform.openai.com)
-* [An Astra DB vector database created](https://docs.datastax.com/en/astra-db-serverless/get-started/quickstart.html) with:
- * Application token (`AstraCS:WSnyFUhRxsrg…`)
- * API endpoint (`https://ASTRA_DB_ID-ASTRA_DB_REGION.apps.astra.datastax.com`)
+- [An Astra DB vector database created](https://docs.datastax.com/en/astra-db-serverless/get-started/quickstart.html) with:
+ - Application token (`AstraCS:WSnyFUhRxsrg…`)
+ - API endpoint (`https://ASTRA_DB_ID-ASTRA_DB_REGION.apps.astra.datastax.com`)
## Create the vector store RAG project
@@ -49,38 +52,40 @@ The **ingestion** flow (bottom of the screen) populates the vector store with da
It ingests data from a file (**File**), splits it into chunks (**Recursive Character Text Splitter**), indexes it in Astra DB (**Astra DB**), and computes embeddings for the chunks (**OpenAI Embeddings**).
This forms a "brain" for the query flow.
-The **query** flow (top of the screen) allows users to chat with the embedded vector store data. It's a little more complex:
+The **query** flow (top of the screen) allows users to chat with the embedded vector store data. It's a little more complex:
-* **Chat Input** component defines where to put the user input coming from the Playground.
-* **OpenAI Embeddings** component generates embeddings from the user input.
-* **Astra DB Search** component retrieves the most relevant Records from the Astra DB database.
-* **Text Output** component turns the Records into Text by concatenating them and also displays it in the Playground.
-* **Prompt** component takes in the user input and the retrieved Records as text and builds a prompt for the OpenAI model.
-* **OpenAI** component generates a response to the prompt.
-* **Chat Output** component displays the response in the Playground.
+- **Chat Input** component defines where to put the user input coming from the Playground.
+- **OpenAI Embeddings** component generates embeddings from the user input.
+- **Astra DB Search** component retrieves the most relevant Records from the Astra DB database.
+- **Text Output** component turns the Records into Text by concatenating them and also displays it in the Playground.
+- **Prompt** component takes in the user input and the retrieved Records as text and builds a prompt for the OpenAI model.
+- **OpenAI** component generates a response to the prompt.
+- **Chat Output** component displays the response in the Playground.
4. To create an environment variable for the **OpenAI** component, in the **OpenAI API Key** field, click the **Globe** button, and then click **Add New Variable**.
- 1. In the **Variable Name** field, enter `openai_api_key`.
- 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
- 3. Click **Save Variable**.
-4. To create environment variables for the **Astra DB** and **Astra DB Search** components:
- 1. In the **Token** field, click the **Globe** button, and then click **Add New Variable**.
- 2. In the **Variable Name** field, enter `astra_token`.
- 3. In the **Value** field, paste your Astra application token (`AstraCS:WSnyFUhRxsrg…`).
- 4. Click **Save Variable**.
- 5. Repeat the above steps for the **API Endpoint** field, pasting your Astra API Endpoint instead (`https://ASTRA_DB_ID-ASTRA_DB_REGION.apps.astra.datastax.com`).
- 6. Add the global variable to both the **Astra DB** and **Astra DB Search** components.
+ 1. In the **Variable Name** field, enter `openai_api_key`.
+ 2. In the **Value** field, paste your OpenAI API Key (`sk-...`).
+ 3. Click **Save Variable**.
+
+5. To create environment variables for the **Astra DB** and **Astra DB Search** components:
+ 1. In the **Token** field, click the **Globe** button, and then click **Add New Variable**.
+ 2. In the **Variable Name** field, enter `astra_token`.
+ 3. In the **Value** field, paste your Astra application token (`AstraCS:WSnyFUhRxsrg…`).
+ 4. Click **Save Variable**.
+ 5. Repeat the above steps for the **API Endpoint** field, pasting your Astra API Endpoint instead (`https://ASTRA_DB_ID-ASTRA_DB_REGION.apps.astra.datastax.com`).
+ 6. Add the global variable to both the **Astra DB** and **Astra DB Search** components.
## Run the vector store RAG flow
1. Click the **Playground** button.
-The **Playground** opens, where you can chat with your data.
+ The **Playground** opens, where you can chat with your data.
2. Type a message and press Enter. (Try something like "What topics do you know about?")
3. The bot will respond with a summary of the data you've embedded.
For example, we embedded a PDF of an engine maintenance manual and asked, "How do I change the oil?"
The bot responds:
+
```
To change the oil in the engine, follow these steps:
@@ -102,7 +107,3 @@ You should use a 3/8 inch wrench to remove the oil drain cap.
```
This matches the size listed in the engine manual, which confirms the flow works: the query returns the unique knowledge we embedded from the Astra vector store.
-
-
-
-
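The query flow's retrieval step amounts to nearest-neighbor search over chunk embeddings. The toy sketch below uses made-up 3-dimensional vectors and cosine similarity; the real flow uses OpenAI embeddings and Astra DB:

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "vector store": chunk text mapped to a pretend embedding.
store = {
    "Drain the oil after warming up the engine.": [0.9, 0.1, 0.0],
    "Check tire pressure monthly.": [0.0, 0.2, 0.9],
}
# Pretend embedding of the query "How do I change the oil?".
query_vec = [0.8, 0.2, 0.1]

best = max(store, key=lambda text: cosine(store[text], query_vec))
print(best)
```

The **Astra DB Search** component performs this kind of lookup at scale, returning the most relevant Records for the **Prompt** component to use as context.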
diff --git a/docs/static/data/AstraDB-RAG-Flows.json b/docs/static/data/AstraDB-RAG-Flows.json
index 10dafa85f..d8bd23eb2 100644
--- a/docs/static/data/AstraDB-RAG-Flows.json
+++ b/docs/static/data/AstraDB-RAG-Flows.json
@@ -1,3403 +1,3147 @@
{
- "id": "51e2b78a-199b-4054-9f32-e288eef6924c",
- "data": {
- "nodes": [
- {
- "id": "ChatInput-yxMKE",
- "type": "genericNode",
- "position": {
- "x": 1195.5276981160775,
- "y": 209.421875
- },
- "data": {
- "type": "ChatInput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n\n def build_config(self):\n build_config = super().build_config()\n build_config[\"input_value\"] = {\n \"input_types\": [],\n \"display_name\": \"Message\",\n \"multiline\": True,\n }\n\n return build_config\n\n def build(\n self,\n sender: Optional[str] = \"User\",\n sender_name: Optional[str] = \"User\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n ) -> Union[Text, Record]:\n return super().build(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "value": "what is a line"
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "User",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "User",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": true,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Get chat inputs from the Playground.",
- "icon": "ChatInput",
- "base_classes": [
- "Text",
- "str",
- "object",
- "Record"
- ],
- "display_name": "Chat Input",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatInput-yxMKE"
- },
- "selected": false,
- "width": 384,
- "height": 383
+ "id": "51e2b78a-199b-4054-9f32-e288eef6924c",
+ "data": {
+ "nodes": [
+ {
+ "id": "ChatInput-yxMKE",
+ "type": "genericNode",
+ "position": {
+ "x": 1195.5276981160775,
+ "y": 209.421875
+ },
+ "data": {
+ "type": "ChatInput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n\n def build_config(self):\n build_config = super().build_config()\n build_config[\"input_value\"] = {\n \"input_types\": [],\n \"display_name\": \"Message\",\n \"multiline\": True,\n }\n\n return build_config\n\n def build(\n self,\n sender: Optional[str] = \"User\",\n sender_name: Optional[str] = \"User\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n ) -> Union[Text, Record]:\n return super().build(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": [],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "value": "what is a line"
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "User",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "User",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "TextOutput-BDknO",
- "type": "genericNode",
- "position": {
- "x": 2322.600672827879,
- "y": 604.9467307442569
- },
- "data": {
- "type": "TextOutput",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Value",
- "advanced": false,
- "input_types": [
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "Text or Record to be passed as output.",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextOutput(TextComponent):\n display_name = \"Text Output\"\n description = \"Display a text output in the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as output.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(self, input_value: Optional[Text] = \"\", record_template: str = \"\") -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "{text}",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a text output in the Playground.",
- "icon": "type",
- "base_classes": [
- "object",
- "Text",
- "str"
- ],
- "display_name": "Extracted Chunks",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "record_template": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "TextOutput-BDknO"
- },
- "selected": false,
- "width": 384,
- "height": 289,
- "positionAbsolute": {
- "x": 2322.600672827879,
- "y": 604.9467307442569
- },
- "dragging": false
+ "description": "Get chat inputs from the Playground.",
+ "icon": "ChatInput",
+ "base_classes": ["Text", "str", "object", "Record"],
+ "display_name": "Chat Input",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null
},
- {
- "id": "OpenAIEmbeddings-ZlOk1",
- "type": "genericNode",
- "position": {
- "x": 1183.667250865064,
- "y": 687.3171828430261
- },
- "data": {
- "type": "OpenAIEmbeddings",
- "node": {
- "template": {
- "allowed_special": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": [],
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "allowed_special",
- "display_name": "Allowed Special",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "chunk_size": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 1000,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "chunk_size",
- "display_name": "Chunk Size",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "client": {
- "type": "Any",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "client",
- "display_name": "Client",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Any, Dict, List, Optional\n\nfrom langchain_openai.embeddings.base import OpenAIEmbeddings\n\nfrom langflow.field_typing import Embeddings, NestedDict\nfrom langflow.interface.custom.custom_component import CustomComponent\n\n\nclass OpenAIEmbeddingsComponent(CustomComponent):\n display_name = \"OpenAI Embeddings\"\n description = \"Generate embeddings using OpenAI models.\"\n\n def build_config(self):\n return {\n \"allowed_special\": {\n \"display_name\": \"Allowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"default_headers\": {\n \"display_name\": \"Default Headers\",\n \"advanced\": True,\n \"field_type\": \"dict\",\n },\n \"default_query\": {\n \"display_name\": \"Default Query\",\n \"advanced\": True,\n \"field_type\": \"NestedDict\",\n },\n \"disallowed_special\": {\n \"display_name\": \"Disallowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"chunk_size\": {\"display_name\": \"Chunk Size\", \"advanced\": True},\n \"client\": {\"display_name\": \"Client\", \"advanced\": True},\n \"deployment\": {\"display_name\": \"Deployment\", \"advanced\": True},\n \"embedding_ctx_length\": {\n \"display_name\": \"Embedding Context Length\",\n \"advanced\": True,\n },\n \"max_retries\": {\"display_name\": \"Max Retries\", \"advanced\": True},\n \"model\": {\n \"display_name\": \"Model\",\n \"advanced\": False,\n \"options\": [\n \"text-embedding-3-small\",\n \"text-embedding-3-large\",\n \"text-embedding-ada-002\",\n ],\n },\n \"model_kwargs\": {\"display_name\": \"Model Kwargs\", \"advanced\": True},\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"password\": True,\n \"advanced\": True,\n },\n \"openai_api_key\": {\"display_name\": \"OpenAI API Key\", \"password\": True},\n \"openai_api_type\": {\n \"display_name\": \"OpenAI API Type\",\n \"advanced\": True,\n \"password\": True,\n },\n \"openai_api_version\": {\n 
\"display_name\": \"OpenAI API Version\",\n \"advanced\": True,\n },\n \"openai_organization\": {\n \"display_name\": \"OpenAI Organization\",\n \"advanced\": True,\n },\n \"openai_proxy\": {\"display_name\": \"OpenAI Proxy\", \"advanced\": True},\n \"request_timeout\": {\"display_name\": \"Request Timeout\", \"advanced\": True},\n \"show_progress_bar\": {\n \"display_name\": \"Show Progress Bar\",\n \"advanced\": True,\n },\n \"skip_empty\": {\"display_name\": \"Skip Empty\", \"advanced\": True},\n \"tiktoken_model_name\": {\n \"display_name\": \"TikToken Model Name\",\n \"advanced\": True,\n },\n \"tiktoken_enable\": {\"display_name\": \"TikToken Enable\", \"advanced\": True},\n }\n\n def build(\n self,\n openai_api_key: str,\n default_headers: Optional[Dict[str, str]] = None,\n default_query: Optional[NestedDict] = {},\n allowed_special: List[str] = [],\n disallowed_special: List[str] = [\"all\"],\n chunk_size: int = 1000,\n client: Optional[Any] = None,\n deployment: str = \"text-embedding-ada-002\",\n embedding_ctx_length: int = 8191,\n max_retries: int = 6,\n model: str = \"text-embedding-ada-002\",\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n openai_api_type: Optional[str] = None,\n openai_api_version: Optional[str] = None,\n openai_organization: Optional[str] = None,\n openai_proxy: Optional[str] = None,\n request_timeout: Optional[float] = None,\n show_progress_bar: bool = False,\n skip_empty: bool = False,\n tiktoken_enable: bool = True,\n tiktoken_model_name: Optional[str] = None,\n ) -> Embeddings:\n # This is to avoid errors with Vector Stores (e.g Chroma)\n if disallowed_special == [\"all\"]:\n disallowed_special = \"all\" # type: ignore\n\n return OpenAIEmbeddings(\n tiktoken_enabled=tiktoken_enable,\n default_headers=default_headers,\n default_query=default_query,\n allowed_special=set(allowed_special),\n disallowed_special=\"all\",\n chunk_size=chunk_size,\n client=client,\n deployment=deployment,\n 
embedding_ctx_length=embedding_ctx_length,\n max_retries=max_retries,\n model=model,\n model_kwargs=model_kwargs,\n base_url=openai_api_base,\n api_key=openai_api_key,\n openai_api_type=openai_api_type,\n api_version=openai_api_version,\n organization=openai_organization,\n openai_proxy=openai_proxy,\n timeout=request_timeout,\n show_progress_bar=show_progress_bar,\n skip_empty=skip_empty,\n tiktoken_model_name=tiktoken_model_name,\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "default_headers": {
- "type": "dict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "default_headers",
- "display_name": "Default Headers",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "default_query": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "default_query",
- "display_name": "Default Query",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "deployment": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "text-embedding-ada-002",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "deployment",
- "display_name": "Deployment",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "disallowed_special": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": [
- "all"
- ],
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "disallowed_special",
- "display_name": "Disallowed Special",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "embedding_ctx_length": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 8191,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "embedding_ctx_length",
- "display_name": "Embedding Context Length",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_retries": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 6,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_retries",
- "display_name": "Max Retries",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "text-embedding-ada-002",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "text-embedding-3-small",
- "text-embedding-3-large",
- "text-embedding-ada-002"
- ],
- "name": "model",
- "display_name": "Model",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "OPENAI_API_KEY"
- },
- "openai_api_type": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_type",
- "display_name": "OpenAI API Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_version": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_version",
- "display_name": "OpenAI API Version",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_organization": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_organization",
- "display_name": "OpenAI Organization",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_proxy": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_proxy",
- "display_name": "OpenAI Proxy",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "request_timeout": {
- "type": "float",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "request_timeout",
- "display_name": "Request Timeout",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "show_progress_bar": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "show_progress_bar",
- "display_name": "Show Progress Bar",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "skip_empty": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "skip_empty",
- "display_name": "Skip Empty",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "tiktoken_enable": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "tiktoken_enable",
- "display_name": "TikToken Enable",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "tiktoken_model_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "tiktoken_model_name",
- "display_name": "TikToken Model Name",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Generate embeddings using OpenAI models.",
- "base_classes": [
- "Embeddings"
- ],
- "display_name": "OpenAI Embeddings",
- "documentation": "",
- "custom_fields": {
- "openai_api_key": null,
- "default_headers": null,
- "default_query": null,
- "allowed_special": null,
- "disallowed_special": null,
- "chunk_size": null,
- "client": null,
- "deployment": null,
- "embedding_ctx_length": null,
- "max_retries": null,
- "model": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "openai_api_type": null,
- "openai_api_version": null,
- "openai_organization": null,
- "openai_proxy": null,
- "request_timeout": null,
- "show_progress_bar": null,
- "skip_empty": null,
- "tiktoken_enable": null,
- "tiktoken_model_name": null
- },
- "output_types": [
- "Embeddings"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "OpenAIEmbeddings-ZlOk1"
- },
- "selected": false,
- "width": 384,
- "height": 383,
- "dragging": false
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatInput-yxMKE"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 383
+ },
+ {
+ "id": "TextOutput-BDknO",
+ "type": "genericNode",
+ "position": {
+ "x": 2322.600672827879,
+ "y": 604.9467307442569
+ },
+ "data": {
+ "type": "TextOutput",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Value",
+ "advanced": false,
+ "input_types": ["Record", "Text"],
+ "dynamic": false,
+ "info": "Text or Record to be passed as output.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextOutput(TextComponent):\n display_name = \"Text Output\"\n description = \"Display a text output in the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as output.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(self, input_value: Optional[Text] = \"\", record_template: str = \"\") -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "{text}",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "OpenAIModel-EjXlN",
- "type": "genericNode",
- "position": {
- "x": 3410.117202077183,
- "y": 431.2038048137648
- },
- "data": {
- "type": "OpenAIModel",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Input",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": [\n \"gpt-4-turbo-preview\",\n \"gpt-3.5-turbo\",\n \"gpt-4-0125-preview\",\n \"gpt-4-1106-preview\",\n \"gpt-4-vision-preview\",\n \"gpt-3.5-turbo-0125\",\n \"gpt-3.5-turbo-1106\",\n ],\n \"value\": \"gpt-4-turbo-preview\",\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. 
Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float,\n model_name: str,\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n output = ChatOpenAI(\n max_tokens=max_tokens,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=openai_api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_tokens": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 256,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_tokens",
- "display_name": "Max Tokens",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model_name": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "gpt-3.5-turbo",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "gpt-4-turbo-preview",
- "gpt-3.5-turbo",
- "gpt-4-0125-preview",
- "gpt-4-1106-preview",
- "gpt-4-vision-preview",
- "gpt-3.5-turbo-0125",
- "gpt-3.5-turbo-1106"
- ],
- "name": "model_name",
- "display_name": "Model Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "The OpenAI API Key to use for the OpenAI model.",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "OPENAI_API_KEY"
- },
- "stream": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "stream",
- "display_name": "Stream",
- "advanced": true,
- "dynamic": false,
- "info": "Stream the response from the model. Streaming works only in Chat.",
- "load_from_db": false,
- "title_case": false
- },
- "system_message": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "system_message",
- "display_name": "System Message",
- "advanced": true,
- "dynamic": false,
- "info": "System message to pass to the model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "temperature": {
- "type": "float",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 0.1,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "temperature",
- "display_name": "Temperature",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent"
- },
- "description": "Generates text using OpenAI LLMs.",
- "icon": "OpenAI",
- "base_classes": [
- "object",
- "Text",
- "str"
- ],
- "display_name": "OpenAI",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "openai_api_key": null,
- "temperature": null,
- "model_name": null,
- "max_tokens": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "stream": null,
- "system_message": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "max_tokens",
- "model_kwargs",
- "model_name",
- "openai_api_base",
- "openai_api_key",
- "temperature",
- "input_value",
- "system_message",
- "stream"
- ],
- "beta": false
- },
- "id": "OpenAIModel-EjXlN"
- },
- "selected": true,
- "width": 384,
- "height": 563,
- "positionAbsolute": {
- "x": 3410.117202077183,
- "y": 431.2038048137648
- },
- "dragging": false
+ "description": "Display a text output in the Playground.",
+ "icon": "type",
+ "base_classes": ["object", "Text", "str"],
+ "display_name": "Extracted Chunks",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "record_template": null
},
- {
- "id": "Prompt-xeI6K",
- "type": "genericNode",
- "position": {
- "x": 2969.0261961391298,
- "y": 442.1613649809069
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "TextOutput-BDknO"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 289,
+ "positionAbsolute": {
+ "x": 2322.600672827879,
+ "y": 604.9467307442569
+ },
+ "dragging": false
+ },
+ {
+ "id": "OpenAIEmbeddings-ZlOk1",
+ "type": "genericNode",
+ "position": {
+ "x": 1183.667250865064,
+ "y": 687.3171828430261
+ },
+ "data": {
+ "type": "OpenAIEmbeddings",
+ "node": {
+ "template": {
+ "allowed_special": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": [],
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "allowed_special",
+ "display_name": "Allowed Special",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "chunk_size": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 1000,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "chunk_size",
+ "display_name": "Chunk Size",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "client": {
+ "type": "Any",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "client",
+ "display_name": "Client",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Any, Dict, List, Optional\n\nfrom langchain_openai.embeddings.base import OpenAIEmbeddings\n\nfrom langflow.field_typing import Embeddings, NestedDict\nfrom langflow.interface.custom.custom_component import CustomComponent\n\n\nclass OpenAIEmbeddingsComponent(CustomComponent):\n display_name = \"OpenAI Embeddings\"\n description = \"Generate embeddings using OpenAI models.\"\n\n def build_config(self):\n return {\n \"allowed_special\": {\n \"display_name\": \"Allowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"default_headers\": {\n \"display_name\": \"Default Headers\",\n \"advanced\": True,\n \"field_type\": \"dict\",\n },\n \"default_query\": {\n \"display_name\": \"Default Query\",\n \"advanced\": True,\n \"field_type\": \"NestedDict\",\n },\n \"disallowed_special\": {\n \"display_name\": \"Disallowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"chunk_size\": {\"display_name\": \"Chunk Size\", \"advanced\": True},\n \"client\": {\"display_name\": \"Client\", \"advanced\": True},\n \"deployment\": {\"display_name\": \"Deployment\", \"advanced\": True},\n \"embedding_ctx_length\": {\n \"display_name\": \"Embedding Context Length\",\n \"advanced\": True,\n },\n \"max_retries\": {\"display_name\": \"Max Retries\", \"advanced\": True},\n \"model\": {\n \"display_name\": \"Model\",\n \"advanced\": False,\n \"options\": [\n \"text-embedding-3-small\",\n \"text-embedding-3-large\",\n \"text-embedding-ada-002\",\n ],\n },\n \"model_kwargs\": {\"display_name\": \"Model Kwargs\", \"advanced\": True},\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"password\": True,\n \"advanced\": True,\n },\n \"openai_api_key\": {\"display_name\": \"OpenAI API Key\", \"password\": True},\n \"openai_api_type\": {\n \"display_name\": \"OpenAI API Type\",\n \"advanced\": True,\n \"password\": True,\n },\n \"openai_api_version\": {\n 
\"display_name\": \"OpenAI API Version\",\n \"advanced\": True,\n },\n \"openai_organization\": {\n \"display_name\": \"OpenAI Organization\",\n \"advanced\": True,\n },\n \"openai_proxy\": {\"display_name\": \"OpenAI Proxy\", \"advanced\": True},\n \"request_timeout\": {\"display_name\": \"Request Timeout\", \"advanced\": True},\n \"show_progress_bar\": {\n \"display_name\": \"Show Progress Bar\",\n \"advanced\": True,\n },\n \"skip_empty\": {\"display_name\": \"Skip Empty\", \"advanced\": True},\n \"tiktoken_model_name\": {\n \"display_name\": \"TikToken Model Name\",\n \"advanced\": True,\n },\n \"tiktoken_enable\": {\"display_name\": \"TikToken Enable\", \"advanced\": True},\n }\n\n def build(\n self,\n openai_api_key: str,\n default_headers: Optional[Dict[str, str]] = None,\n default_query: Optional[NestedDict] = {},\n allowed_special: List[str] = [],\n disallowed_special: List[str] = [\"all\"],\n chunk_size: int = 1000,\n client: Optional[Any] = None,\n deployment: str = \"text-embedding-ada-002\",\n embedding_ctx_length: int = 8191,\n max_retries: int = 6,\n model: str = \"text-embedding-ada-002\",\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n openai_api_type: Optional[str] = None,\n openai_api_version: Optional[str] = None,\n openai_organization: Optional[str] = None,\n openai_proxy: Optional[str] = None,\n request_timeout: Optional[float] = None,\n show_progress_bar: bool = False,\n skip_empty: bool = False,\n tiktoken_enable: bool = True,\n tiktoken_model_name: Optional[str] = None,\n ) -> Embeddings:\n # This is to avoid errors with Vector Stores (e.g Chroma)\n if disallowed_special == [\"all\"]:\n disallowed_special = \"all\" # type: ignore\n\n return OpenAIEmbeddings(\n tiktoken_enabled=tiktoken_enable,\n default_headers=default_headers,\n default_query=default_query,\n allowed_special=set(allowed_special),\n disallowed_special=\"all\",\n chunk_size=chunk_size,\n client=client,\n deployment=deployment,\n 
embedding_ctx_length=embedding_ctx_length,\n max_retries=max_retries,\n model=model,\n model_kwargs=model_kwargs,\n base_url=openai_api_base,\n api_key=openai_api_key,\n openai_api_type=openai_api_type,\n api_version=openai_api_version,\n organization=openai_organization,\n openai_proxy=openai_proxy,\n timeout=request_timeout,\n show_progress_bar=show_progress_bar,\n skip_empty=skip_empty,\n tiktoken_model_name=tiktoken_model_name,\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "default_headers": {
+ "type": "dict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "default_headers",
+ "display_name": "Default Headers",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "default_query": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "default_query",
+ "display_name": "Default Query",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "deployment": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "text-embedding-ada-002",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "deployment",
+ "display_name": "Deployment",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "disallowed_special": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": ["all"],
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "disallowed_special",
+ "display_name": "Disallowed Special",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "embedding_ctx_length": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 8191,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "embedding_ctx_length",
+ "display_name": "Embedding Context Length",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_retries": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 6,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_retries",
+ "display_name": "Max Retries",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "text-embedding-ada-002",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "text-embedding-3-small",
+ "text-embedding-3-large",
+ "text-embedding-ada-002"
+ ],
+ "name": "model",
+ "display_name": "Model",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "OPENAI_API_KEY"
+ },
+ "openai_api_type": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_type",
+ "display_name": "OpenAI API Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_version": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_version",
+ "display_name": "OpenAI API Version",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_organization": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_organization",
+ "display_name": "OpenAI Organization",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_proxy": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_proxy",
+ "display_name": "OpenAI Proxy",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "request_timeout": {
+ "type": "float",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "request_timeout",
+ "display_name": "Request Timeout",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
},
- "data": {
- "type": "Prompt",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.field_typing import Prompt, TemplateField, Text\nfrom langflow.interface.custom.custom_component import CustomComponent\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "template": {
- "type": "prompt",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "{context}\n\n---\n\nGiven the context above, answer the question as best as possible.\n\nQuestion: {question}\n\nAnswer: ",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "template",
- "display_name": "Template",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent",
- "context": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "context",
- "display_name": "context",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- },
- "question": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "question",
- "display_name": "question",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- }
- },
- "description": "Create a prompt template with dynamic variables.",
- "icon": "prompts",
- "is_input": null,
- "is_output": null,
- "is_composition": null,
- "base_classes": [
- "object",
- "Text",
- "str"
- ],
- "name": "",
- "display_name": "Prompt",
- "documentation": "",
- "custom_fields": {
- "template": [
- "context",
- "question"
- ]
- },
- "output_types": [
- "Text"
- ],
- "full_path": null,
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false,
- "error": null
- },
- "id": "Prompt-xeI6K",
- "description": "Create a prompt template with dynamic variables.",
- "display_name": "Prompt"
- },
- "selected": false,
- "width": 384,
- "height": 477,
- "positionAbsolute": {
- "x": 2969.0261961391298,
- "y": 442.1613649809069
- },
- "dragging": false
+ "load_from_db": false,
+ "title_case": false
+ },
+ "show_progress_bar": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "show_progress_bar",
+ "display_name": "Show Progress Bar",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "skip_empty": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "skip_empty",
+ "display_name": "Skip Empty",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "tiktoken_enable": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "tiktoken_enable",
+ "display_name": "TikToken Enable",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "tiktoken_model_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "tiktoken_model_name",
+ "display_name": "TikToken Model Name",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "ChatOutput-Q39I8",
- "type": "genericNode",
- "position": {
- "x": 3887.2073667611485,
- "y": 588.4801225794856
- },
- "data": {
- "type": "ChatOutput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template,\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "{text}",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "In case of Message being a Record, this template will be used to convert it to text.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Machine",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "AI",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": true,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a chat message in the Playground.",
- "icon": "ChatOutput",
- "base_classes": [
- "object",
- "Text",
- "Record",
- "str"
- ],
- "display_name": "Chat Output",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null,
- "record_template": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatOutput-Q39I8"
- },
- "selected": false,
- "width": 384,
- "height": 383,
- "positionAbsolute": {
- "x": 3887.2073667611485,
- "y": 588.4801225794856
- },
- "dragging": false
+ "description": "Generate embeddings using OpenAI models.",
+ "base_classes": ["Embeddings"],
+ "display_name": "OpenAI Embeddings",
+ "documentation": "",
+ "custom_fields": {
+ "openai_api_key": null,
+ "default_headers": null,
+ "default_query": null,
+ "allowed_special": null,
+ "disallowed_special": null,
+ "chunk_size": null,
+ "client": null,
+ "deployment": null,
+ "embedding_ctx_length": null,
+ "max_retries": null,
+ "model": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "openai_api_type": null,
+ "openai_api_version": null,
+ "openai_organization": null,
+ "openai_proxy": null,
+ "request_timeout": null,
+ "show_progress_bar": null,
+ "skip_empty": null,
+ "tiktoken_enable": null,
+ "tiktoken_model_name": null
},
- {
- "id": "File-t0a6a",
- "type": "genericNode",
- "position": {
- "x": 2257.233450682836,
- "y": 1747.5389618367233
+ "output_types": ["Embeddings"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "OpenAIEmbeddings-ZlOk1"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 383,
+ "dragging": false
+ },
+ {
+ "id": "OpenAIModel-EjXlN",
+ "type": "genericNode",
+ "position": {
+ "x": 3410.117202077183,
+ "y": 431.2038048137648
+ },
+ "data": {
+ "type": "OpenAIModel",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Input",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": [\n \"gpt-4-turbo-preview\",\n \"gpt-3.5-turbo\",\n \"gpt-4-0125-preview\",\n \"gpt-4-1106-preview\",\n \"gpt-4-vision-preview\",\n \"gpt-3.5-turbo-0125\",\n \"gpt-3.5-turbo-1106\",\n ],\n \"value\": \"gpt-4-turbo-preview\",\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float,\n model_name: str,\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n output = ChatOpenAI(\n max_tokens=max_tokens,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=openai_api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_tokens": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 256,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_tokens",
+ "display_name": "Max Tokens",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_name": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "gpt-3.5-turbo",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "gpt-4-turbo-preview",
+ "gpt-3.5-turbo",
+ "gpt-4-0125-preview",
+ "gpt-4-1106-preview",
+ "gpt-4-vision-preview",
+ "gpt-3.5-turbo-0125",
+ "gpt-3.5-turbo-1106"
+ ],
+ "name": "model_name",
+ "display_name": "Model Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The OpenAI API Key to use for the OpenAI model.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "OPENAI_API_KEY"
+ },
+ "stream": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "stream",
+ "display_name": "Stream",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Stream the response from the model. Streaming works only in Chat.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "system_message": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "system_message",
+ "display_name": "System Message",
+ "advanced": true,
+ "dynamic": false,
+ "info": "System message to pass to the model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "temperature": {
+ "type": "float",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 0.1,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "temperature",
+ "display_name": "Temperature",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
},
- "data": {
- "type": "File",
- "node": {
- "template": {
- "path": {
- "type": "file",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [
- ".txt",
- ".md",
- ".mdx",
- ".csv",
- ".json",
- ".yaml",
- ".yml",
- ".xml",
- ".html",
- ".htm",
- ".pdf",
- ".docx"
- ],
- "file_path": "51e2b78a-199b-4054-9f32-e288eef6924c/Langflow conversation.pdf",
- "password": false,
- "name": "path",
- "display_name": "Path",
- "advanced": false,
- "dynamic": false,
- "info": "Supported file types: txt, md, mdx, csv, json, yaml, yml, xml, html, htm, pdf, docx",
- "load_from_db": false,
- "title_case": false,
- "value": ""
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from pathlib import Path\nfrom typing import Any, Dict\n\nfrom langflow.base.data.utils import TEXT_FILE_TYPES, parse_text_file_to_record\nfrom langflow.interface.custom.custom_component import CustomComponent\nfrom langflow.schema import Record\n\n\nclass FileComponent(CustomComponent):\n display_name = \"File\"\n description = \"A generic file loader.\"\n icon = \"file-text\"\n\n def build_config(self) -> Dict[str, Any]:\n return {\n \"path\": {\n \"display_name\": \"Path\",\n \"field_type\": \"file\",\n \"file_types\": TEXT_FILE_TYPES,\n \"info\": f\"Supported file types: {', '.join(TEXT_FILE_TYPES)}\",\n },\n \"silent_errors\": {\n \"display_name\": \"Silent Errors\",\n \"advanced\": True,\n \"info\": \"If true, errors will not raise an exception.\",\n },\n }\n\n def load_file(self, path: str, silent_errors: bool = False) -> Record:\n resolved_path = self.resolve_path(path)\n path_obj = Path(resolved_path)\n extension = path_obj.suffix[1:].lower()\n if extension == \"doc\":\n raise ValueError(\"doc files are not supported. Please save as .docx\")\n if extension not in TEXT_FILE_TYPES:\n raise ValueError(f\"Unsupported file type: {extension}\")\n record = parse_text_file_to_record(resolved_path, silent_errors)\n self.status = record if record else \"No data\"\n return record or Record()\n\n def build(\n self,\n path: str,\n silent_errors: bool = False,\n ) -> Record:\n record = self.load_file(path, silent_errors)\n self.status = record\n return record\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "silent_errors": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "silent_errors",
- "display_name": "Silent Errors",
- "advanced": true,
- "dynamic": false,
- "info": "If true, errors will not raise an exception.",
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent"
- },
- "description": "A generic file loader.",
- "icon": "file-text",
- "base_classes": [
- "Record"
- ],
- "display_name": "File",
- "documentation": "",
- "custom_fields": {
- "path": null,
- "silent_errors": null
- },
- "output_types": [
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "File-t0a6a"
- },
- "selected": false,
- "width": 384,
- "height": 281,
- "positionAbsolute": {
- "x": 2257.233450682836,
- "y": 1747.5389618367233
- },
- "dragging": false
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "RecursiveCharacterTextSplitter-tR9QM",
- "type": "genericNode",
- "position": {
- "x": 2791.013514133929,
- "y": 1462.9588953494142
- },
- "data": {
- "type": "RecursiveCharacterTextSplitter",
- "node": {
- "template": {
- "inputs": {
- "type": "Document",
- "required": true,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "inputs",
- "display_name": "Input",
- "advanced": false,
- "input_types": [
- "Document",
- "Record"
- ],
- "dynamic": false,
- "info": "The texts to split.",
- "load_from_db": false,
- "title_case": false
- },
- "chunk_overlap": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 200,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "chunk_overlap",
- "display_name": "Chunk Overlap",
- "advanced": false,
- "dynamic": false,
- "info": "The amount of overlap between chunks.",
- "load_from_db": false,
- "title_case": false
- },
- "chunk_size": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 1000,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "chunk_size",
- "display_name": "Chunk Size",
- "advanced": false,
- "dynamic": false,
- "info": "The maximum length of each chunk.",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langchain.text_splitter import RecursiveCharacterTextSplitter\nfrom langchain_core.documents import Document\n\nfrom langflow.interface.custom.custom_component import CustomComponent\nfrom langflow.schema import Record\nfrom langflow.utils.util import build_loader_repr_from_records, unescape_string\n\n\nclass RecursiveCharacterTextSplitterComponent(CustomComponent):\n display_name: str = \"Recursive Character Text Splitter\"\n description: str = \"Split text into chunks of a specified length.\"\n documentation: str = \"https://docs.langflow.org/components/text-splitters#recursivecharactertextsplitter\"\n\n def build_config(self):\n return {\n \"inputs\": {\n \"display_name\": \"Input\",\n \"info\": \"The texts to split.\",\n \"input_types\": [\"Document\", \"Record\"],\n },\n \"separators\": {\n \"display_name\": \"Separators\",\n \"info\": 'The characters to split on.\\nIf left empty defaults to [\"\\\\n\\\\n\", \"\\\\n\", \" \", \"\"].',\n \"is_list\": True,\n },\n \"chunk_size\": {\n \"display_name\": \"Chunk Size\",\n \"info\": \"The maximum length of each chunk.\",\n \"field_type\": \"int\",\n \"value\": 1000,\n },\n \"chunk_overlap\": {\n \"display_name\": \"Chunk Overlap\",\n \"info\": \"The amount of overlap between chunks.\",\n \"field_type\": \"int\",\n \"value\": 200,\n },\n \"code\": {\"show\": False},\n }\n\n def build(\n self,\n inputs: list[Document],\n separators: Optional[list[str]] = None,\n chunk_size: Optional[int] = 1000,\n chunk_overlap: Optional[int] = 200,\n ) -> list[Record]:\n \"\"\"\n Split text into chunks of a specified length.\n\n Args:\n separators (list[str]): The characters to split on.\n chunk_size (int): The maximum length of each chunk.\n chunk_overlap (int): The amount of overlap between chunks.\n length_function (function): The function to use to calculate the length of the text.\n\n Returns:\n list[str]: The chunks of text.\n \"\"\"\n\n if separators == \"\":\n separators = None\n elif separators:\n # check if the separators list has escaped characters\n # if there are escaped characters, unescape them\n separators = [unescape_string(x) for x in separators]\n\n # Make sure chunk_size and chunk_overlap are ints\n if isinstance(chunk_size, str):\n chunk_size = int(chunk_size)\n if isinstance(chunk_overlap, str):\n chunk_overlap = int(chunk_overlap)\n splitter = RecursiveCharacterTextSplitter(\n separators=separators,\n chunk_size=chunk_size,\n chunk_overlap=chunk_overlap,\n )\n documents = []\n for _input in inputs:\n if isinstance(_input, Record):\n documents.append(_input.to_lc_document())\n else:\n documents.append(_input)\n docs = splitter.split_documents(documents)\n records = self.to_records(docs)\n self.repr_value = build_loader_repr_from_records(records)\n return records\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "separators": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "separators",
- "display_name": "Separators",
- "advanced": false,
- "dynamic": false,
- "info": "The characters to split on.\nIf left empty defaults to [\"\\n\\n\", \"\\n\", \" \", \"\"].",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": [
- ""
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Split text into chunks of a specified length.",
- "base_classes": [
- "Record"
- ],
- "display_name": "Recursive Character Text Splitter",
- "documentation": "https://docs.langflow.org/components/text-splitters#recursivecharactertextsplitter",
- "custom_fields": {
- "inputs": null,
- "separators": null,
- "chunk_size": null,
- "chunk_overlap": null
- },
- "output_types": [
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "RecursiveCharacterTextSplitter-tR9QM"
- },
- "selected": false,
- "width": 384,
- "height": 501,
- "positionAbsolute": {
- "x": 2791.013514133929,
- "y": 1462.9588953494142
- },
- "dragging": false
+ "description": "Generates text using OpenAI LLMs.",
+ "icon": "OpenAI",
+ "base_classes": ["object", "Text", "str"],
+ "display_name": "OpenAI",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "openai_api_key": null,
+ "temperature": null,
+ "model_name": null,
+ "max_tokens": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "stream": null,
+ "system_message": null
},
- {
- "id": "AstraDBSearch-41nRz",
- "type": "genericNode",
- "position": {
- "x": 1723.976434815103,
- "y": 277.03317407245913
- },
- "data": {
- "type": "AstraDBSearch",
- "node": {
- "template": {
- "embedding": {
- "type": "Embeddings",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "embedding",
- "display_name": "Embedding",
- "advanced": false,
- "dynamic": false,
- "info": "Embedding to use",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Input Value",
- "advanced": false,
- "dynamic": false,
- "info": "Input value to search",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "api_endpoint": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "api_endpoint",
- "display_name": "API Endpoint",
- "advanced": false,
- "dynamic": false,
- "info": "API endpoint URL for the Astra DB service.",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "ASTRA_DB_API_ENDPOINT"
- },
- "batch_size": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "batch_size",
- "display_name": "Batch Size",
- "advanced": true,
- "dynamic": false,
- "info": "Optional number of records to process in a single batch.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_delete_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_delete_concurrency",
- "display_name": "Bulk Delete Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk delete operations.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_insert_batch_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_insert_batch_concurrency",
- "display_name": "Bulk Insert Batch Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk insert operations.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_insert_overwrite_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_insert_overwrite_concurrency",
- "display_name": "Bulk Insert Overwrite Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk insert operations that overwrite existing records.",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import List, Optional\n\nfrom langflow.components.vectorstores.AstraDB import AstraDBVectorStoreComponent\nfrom langflow.components.vectorstores.base.model import LCVectorStoreComponent\nfrom langflow.field_typing import Embeddings, Text\nfrom langflow.schema import Record\n\n\nclass AstraDBSearchComponent(LCVectorStoreComponent):\n display_name = \"Astra DB Search\"\n description = \"Searches an existing Astra DB Vector Store.\"\n icon = \"AstraDB\"\n field_order = [\"token\", \"api_endpoint\", \"collection_name\", \"input_value\", \"embedding\"]\n\n def build_config(self):\n return {\n \"search_type\": {\n \"display_name\": \"Search Type\",\n \"options\": [\"Similarity\", \"MMR\"],\n },\n \"input_value\": {\n \"display_name\": \"Input Value\",\n \"info\": \"Input value to search\",\n },\n \"embedding\": {\"display_name\": \"Embedding\", \"info\": \"Embedding to use\"},\n \"collection_name\": {\n \"display_name\": \"Collection Name\",\n \"info\": \"The name of the collection within Astra DB where the vectors will be stored.\",\n },\n \"token\": {\n \"display_name\": \"Token\",\n \"info\": \"Authentication token for accessing Astra DB.\",\n \"password\": True,\n },\n \"api_endpoint\": {\n \"display_name\": \"API Endpoint\",\n \"info\": \"API endpoint URL for the Astra DB service.\",\n },\n \"namespace\": {\n \"display_name\": \"Namespace\",\n \"info\": \"Optional namespace within Astra DB to use for the collection.\",\n \"advanced\": True,\n },\n \"metric\": {\n \"display_name\": \"Metric\",\n \"info\": \"Optional distance metric for vector comparisons in the vector store.\",\n \"advanced\": True,\n },\n \"batch_size\": {\n \"display_name\": \"Batch Size\",\n \"info\": \"Optional number of records to process in a single batch.\",\n \"advanced\": True,\n },\n \"bulk_insert_batch_concurrency\": {\n \"display_name\": \"Bulk Insert Batch Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations.\",\n \"advanced\": True,\n },\n \"bulk_insert_overwrite_concurrency\": {\n \"display_name\": \"Bulk Insert Overwrite Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations that overwrite existing records.\",\n \"advanced\": True,\n },\n \"bulk_delete_concurrency\": {\n \"display_name\": \"Bulk Delete Concurrency\",\n \"info\": \"Optional concurrency level for bulk delete operations.\",\n \"advanced\": True,\n },\n \"setup_mode\": {\n \"display_name\": \"Setup Mode\",\n \"info\": \"Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.\",\n \"options\": [\"Sync\", \"Async\", \"Off\"],\n \"advanced\": True,\n },\n \"pre_delete_collection\": {\n \"display_name\": \"Pre Delete Collection\",\n \"info\": \"Boolean flag to determine whether to delete the collection before creating a new one.\",\n \"advanced\": True,\n },\n \"metadata_indexing_include\": {\n \"display_name\": \"Metadata Indexing Include\",\n \"info\": \"Optional list of metadata fields to include in the indexing.\",\n \"advanced\": True,\n },\n \"metadata_indexing_exclude\": {\n \"display_name\": \"Metadata Indexing Exclude\",\n \"info\": \"Optional list of metadata fields to exclude from the indexing.\",\n \"advanced\": True,\n },\n \"collection_indexing_policy\": {\n \"display_name\": \"Collection Indexing Policy\",\n \"info\": \"Optional dictionary defining the indexing policy for the collection.\",\n \"advanced\": True,\n },\n \"number_of_results\": {\n \"display_name\": \"Number of Results\",\n \"info\": \"Number of results to return.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n embedding: Embeddings,\n collection_name: str,\n input_value: Text,\n token: str,\n api_endpoint: str,\n search_type: str = \"Similarity\",\n number_of_results: int = 4,\n namespace: Optional[str] = None,\n metric: Optional[str] = None,\n batch_size: Optional[int] = None,\n bulk_insert_batch_concurrency: Optional[int] = None,\n bulk_insert_overwrite_concurrency: Optional[int] = None,\n bulk_delete_concurrency: Optional[int] = None,\n setup_mode: str = \"Sync\",\n pre_delete_collection: bool = False,\n metadata_indexing_include: Optional[List[str]] = None,\n metadata_indexing_exclude: Optional[List[str]] = None,\n collection_indexing_policy: Optional[dict] = None,\n ) -> List[Record]:\n vector_store = AstraDBVectorStoreComponent().build(\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n try:\n return self.search_with_vector_store(input_value, search_type, vector_store, k=number_of_results)\n except KeyError as e:\n if \"content\" in str(e):\n raise ValueError(\n \"You should ingest data through Langflow (or LangChain) to query it in Langflow. Your collection does not contain a field name 'content'.\"\n )\n else:\n raise e\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "collection_indexing_policy": {
- "type": "dict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "collection_indexing_policy",
- "display_name": "Collection Indexing Policy",
- "advanced": true,
- "dynamic": false,
- "info": "Optional dictionary defining the indexing policy for the collection.",
- "load_from_db": false,
- "title_case": false
- },
- "collection_name": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "collection_name",
- "display_name": "Collection Name",
- "advanced": false,
- "dynamic": false,
- "info": "The name of the collection within Astra DB where the vectors will be stored.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "langflow"
- },
- "metadata_indexing_exclude": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metadata_indexing_exclude",
- "display_name": "Metadata Indexing Exclude",
- "advanced": true,
- "dynamic": false,
- "info": "Optional list of metadata fields to exclude from the indexing.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "metadata_indexing_include": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metadata_indexing_include",
- "display_name": "Metadata Indexing Include",
- "advanced": true,
- "dynamic": false,
- "info": "Optional list of metadata fields to include in the indexing.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "metric": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metric",
- "display_name": "Metric",
- "advanced": true,
- "dynamic": false,
- "info": "Optional distance metric for vector comparisons in the vector store.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "namespace": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "namespace",
- "display_name": "Namespace",
- "advanced": true,
- "dynamic": false,
- "info": "Optional namespace within Astra DB to use for the collection.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "number_of_results": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 4,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "number_of_results",
- "display_name": "Number of Results",
- "advanced": true,
- "dynamic": false,
- "info": "Number of results to return.",
- "load_from_db": false,
- "title_case": false
- },
- "pre_delete_collection": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "pre_delete_collection",
- "display_name": "Pre Delete Collection",
- "advanced": true,
- "dynamic": false,
- "info": "Boolean flag to determine whether to delete the collection before creating a new one.",
- "load_from_db": false,
- "title_case": false
- },
- "search_type": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Similarity",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Similarity",
- "MMR"
- ],
- "name": "search_type",
- "display_name": "Search Type",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "setup_mode": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Sync",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Sync",
- "Async",
- "Off"
- ],
- "name": "setup_mode",
- "display_name": "Setup Mode",
- "advanced": true,
- "dynamic": false,
- "info": "Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "token": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "token",
- "display_name": "Token",
- "advanced": false,
- "dynamic": false,
- "info": "Authentication token for accessing Astra DB.",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "ASTRA_DB_APPLICATION_TOKEN"
- },
- "_type": "CustomComponent"
- },
- "description": "Searches an existing Astra DB Vector Store.",
- "icon": "AstraDB",
- "base_classes": [
- "Record"
- ],
- "display_name": "Astra DB Search",
- "documentation": "",
- "custom_fields": {
- "embedding": null,
- "collection_name": null,
- "input_value": null,
- "token": null,
- "api_endpoint": null,
- "search_type": null,
- "number_of_results": null,
- "namespace": null,
- "metric": null,
- "batch_size": null,
- "bulk_insert_batch_concurrency": null,
- "bulk_insert_overwrite_concurrency": null,
- "bulk_delete_concurrency": null,
- "setup_mode": null,
- "pre_delete_collection": null,
- "metadata_indexing_include": null,
- "metadata_indexing_exclude": null,
- "collection_indexing_policy": null
- },
- "output_types": [
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "token",
- "api_endpoint",
- "collection_name",
- "input_value",
- "embedding"
- ],
- "beta": false
- },
- "id": "AstraDBSearch-41nRz"
- },
- "selected": false,
- "width": 384,
- "height": 713,
- "dragging": false,
- "positionAbsolute": {
- "x": 1723.976434815103,
- "y": 277.03317407245913
- }
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "max_tokens",
+ "model_kwargs",
+ "model_name",
+ "openai_api_base",
+ "openai_api_key",
+ "temperature",
+ "input_value",
+ "system_message",
+ "stream"
+ ],
+ "beta": false
+ },
+ "id": "OpenAIModel-EjXlN"
+ },
+ "selected": true,
+ "width": 384,
+ "height": 563,
+ "positionAbsolute": {
+ "x": 3410.117202077183,
+ "y": 431.2038048137648
+ },
+ "dragging": false
+ },
+ {
+ "id": "Prompt-xeI6K",
+ "type": "genericNode",
+ "position": {
+ "x": 2969.0261961391298,
+ "y": 442.1613649809069
+ },
+ "data": {
+ "type": "Prompt",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.field_typing import Prompt, TemplateField, Text\nfrom langflow.interface.custom.custom_component import CustomComponent\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "template": {
+ "type": "prompt",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "{context}\n\n---\n\nGiven the context above, answer the question as best as possible.\n\nQuestion: {question}\n\nAnswer: ",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "template",
+ "display_name": "Template",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent",
+ "context": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "context",
+ "display_name": "context",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ },
+ "question": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "question",
+ "display_name": "question",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ }
},
- {
- "id": "AstraDB-eUCSS",
- "type": "genericNode",
- "position": {
- "x": 3372.04958055989,
- "y": 1611.0742035495277
- },
- "data": {
- "type": "AstraDB",
- "node": {
- "template": {
- "embedding": {
- "type": "Embeddings",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "embedding",
- "display_name": "Embedding",
- "advanced": false,
- "dynamic": false,
- "info": "Embedding to use",
- "load_from_db": false,
- "title_case": false
- },
- "inputs": {
- "type": "Record",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "inputs",
- "display_name": "Inputs",
- "advanced": false,
- "dynamic": false,
- "info": "Optional list of records to be processed and stored in the vector store.",
- "load_from_db": false,
- "title_case": false
- },
- "api_endpoint": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "api_endpoint",
- "display_name": "API Endpoint",
- "advanced": false,
- "dynamic": false,
- "info": "API endpoint URL for the Astra DB service.",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "ASTRA_DB_API_ENDPOINT"
- },
- "batch_size": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "batch_size",
- "display_name": "Batch Size",
- "advanced": true,
- "dynamic": false,
- "info": "Optional number of records to process in a single batch.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_delete_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_delete_concurrency",
- "display_name": "Bulk Delete Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk delete operations.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_insert_batch_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_insert_batch_concurrency",
- "display_name": "Bulk Insert Batch Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk insert operations.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_insert_overwrite_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_insert_overwrite_concurrency",
- "display_name": "Bulk Insert Overwrite Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk insert operations that overwrite existing records.",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import List, Optional\n\nfrom langchain_astradb import AstraDBVectorStore\nfrom langchain_astradb.utils.astradb import SetupMode\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Embeddings, VectorStore\nfrom langflow.schema import Record\n\n\nclass AstraDBVectorStoreComponent(CustomComponent):\n display_name = \"Astra DB\"\n description = \"Builds or loads an Astra DB Vector Store.\"\n icon = \"AstraDB\"\n field_order = [\"token\", \"api_endpoint\", \"collection_name\", \"inputs\", \"embedding\"]\n\n def build_config(self):\n return {\n \"inputs\": {\n \"display_name\": \"Inputs\",\n \"info\": \"Optional list of records to be processed and stored in the vector store.\",\n },\n \"embedding\": {\"display_name\": \"Embedding\", \"info\": \"Embedding to use\"},\n \"collection_name\": {\n \"display_name\": \"Collection Name\",\n \"info\": \"The name of the collection within Astra DB where the vectors will be stored.\",\n },\n \"token\": {\n \"display_name\": \"Token\",\n \"info\": \"Authentication token for accessing Astra DB.\",\n \"password\": True,\n },\n \"api_endpoint\": {\n \"display_name\": \"API Endpoint\",\n \"info\": \"API endpoint URL for the Astra DB service.\",\n },\n \"namespace\": {\n \"display_name\": \"Namespace\",\n \"info\": \"Optional namespace within Astra DB to use for the collection.\",\n \"advanced\": True,\n },\n \"metric\": {\n \"display_name\": \"Metric\",\n \"info\": \"Optional distance metric for vector comparisons in the vector store.\",\n \"advanced\": True,\n },\n \"batch_size\": {\n \"display_name\": \"Batch Size\",\n \"info\": \"Optional number of records to process in a single batch.\",\n \"advanced\": True,\n },\n \"bulk_insert_batch_concurrency\": {\n \"display_name\": \"Bulk Insert Batch Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations.\",\n \"advanced\": True,\n },\n \"bulk_insert_overwrite_concurrency\": {\n \"display_name\": \"Bulk Insert Overwrite Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations that overwrite existing records.\",\n \"advanced\": True,\n },\n \"bulk_delete_concurrency\": {\n \"display_name\": \"Bulk Delete Concurrency\",\n \"info\": \"Optional concurrency level for bulk delete operations.\",\n \"advanced\": True,\n },\n \"setup_mode\": {\n \"display_name\": \"Setup Mode\",\n \"info\": \"Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.\",\n \"options\": [\"Sync\", \"Async\", \"Off\"],\n \"advanced\": True,\n },\n \"pre_delete_collection\": {\n \"display_name\": \"Pre Delete Collection\",\n \"info\": \"Boolean flag to determine whether to delete the collection before creating a new one.\",\n \"advanced\": True,\n },\n \"metadata_indexing_include\": {\n \"display_name\": \"Metadata Indexing Include\",\n \"info\": \"Optional list of metadata fields to include in the indexing.\",\n \"advanced\": True,\n },\n \"metadata_indexing_exclude\": {\n \"display_name\": \"Metadata Indexing Exclude\",\n \"info\": \"Optional list of metadata fields to exclude from the indexing.\",\n \"advanced\": True,\n },\n \"collection_indexing_policy\": {\n \"display_name\": \"Collection Indexing Policy\",\n \"info\": \"Optional dictionary defining the indexing policy for the collection.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n embedding: Embeddings,\n token: str,\n api_endpoint: str,\n collection_name: str,\n inputs: Optional[List[Record]] = None,\n namespace: Optional[str] = None,\n metric: Optional[str] = None,\n batch_size: Optional[int] = None,\n bulk_insert_batch_concurrency: Optional[int] = None,\n bulk_insert_overwrite_concurrency: Optional[int] = None,\n bulk_delete_concurrency: Optional[int] = None,\n setup_mode: str = \"Async\",\n pre_delete_collection: bool = False,\n metadata_indexing_include: Optional[List[str]] = None,\n metadata_indexing_exclude: Optional[List[str]] = None,\n collection_indexing_policy: Optional[dict] = None,\n ) -> VectorStore:\n try:\n setup_mode_value = SetupMode[setup_mode.upper()]\n except KeyError:\n raise ValueError(f\"Invalid setup mode: {setup_mode}\")\n if inputs:\n documents = [_input.to_lc_document() for _input in inputs]\n\n vector_store = AstraDBVectorStore.from_documents(\n documents=documents,\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode_value,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n else:\n vector_store = AstraDBVectorStore(\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode_value,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n\n return vector_store\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "collection_indexing_policy": {
- "type": "dict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "collection_indexing_policy",
- "display_name": "Collection Indexing Policy",
- "advanced": true,
- "dynamic": false,
- "info": "Optional dictionary defining the indexing policy for the collection.",
- "load_from_db": false,
- "title_case": false
- },
- "collection_name": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "collection_name",
- "display_name": "Collection Name",
- "advanced": false,
- "dynamic": false,
- "info": "The name of the collection within Astra DB where the vectors will be stored.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "langflow"
- },
- "metadata_indexing_exclude": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metadata_indexing_exclude",
- "display_name": "Metadata Indexing Exclude",
- "advanced": true,
- "dynamic": false,
- "info": "Optional list of metadata fields to exclude from the indexing.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "metadata_indexing_include": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metadata_indexing_include",
- "display_name": "Metadata Indexing Include",
- "advanced": true,
- "dynamic": false,
- "info": "Optional list of metadata fields to include in the indexing.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "metric": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metric",
- "display_name": "Metric",
- "advanced": true,
- "dynamic": false,
- "info": "Optional distance metric for vector comparisons in the vector store.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "namespace": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "namespace",
- "display_name": "Namespace",
- "advanced": true,
- "dynamic": false,
- "info": "Optional namespace within Astra DB to use for the collection.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "pre_delete_collection": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "pre_delete_collection",
- "display_name": "Pre Delete Collection",
- "advanced": true,
- "dynamic": false,
- "info": "Boolean flag to determine whether to delete the collection before creating a new one.",
- "load_from_db": false,
- "title_case": false
- },
- "setup_mode": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Async",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Sync",
- "Async",
- "Off"
- ],
- "name": "setup_mode",
- "display_name": "Setup Mode",
- "advanced": true,
- "dynamic": false,
- "info": "Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "token": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "token",
- "display_name": "Token",
- "advanced": false,
- "dynamic": false,
- "info": "Authentication token for accessing Astra DB.",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "ASTRA_DB_APPLICATION_TOKEN"
- },
- "_type": "CustomComponent"
- },
- "description": "Builds or loads an Astra DB Vector Store.",
- "icon": "AstraDB",
- "base_classes": [
- "VectorStore"
- ],
- "display_name": "Astra DB",
- "documentation": "",
- "custom_fields": {
- "embedding": null,
- "token": null,
- "api_endpoint": null,
- "collection_name": null,
- "inputs": null,
- "namespace": null,
- "metric": null,
- "batch_size": null,
- "bulk_insert_batch_concurrency": null,
- "bulk_insert_overwrite_concurrency": null,
- "bulk_delete_concurrency": null,
- "setup_mode": null,
- "pre_delete_collection": null,
- "metadata_indexing_include": null,
- "metadata_indexing_exclude": null,
- "collection_indexing_policy": null
- },
- "output_types": [
- "VectorStore"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "token",
- "api_endpoint",
- "collection_name",
- "inputs",
- "embedding"
- ],
- "beta": false
- },
- "id": "AstraDB-eUCSS"
- },
- "selected": false,
- "width": 384,
- "height": 573,
- "positionAbsolute": {
- "x": 3372.04958055989,
- "y": 1611.0742035495277
- },
- "dragging": false
+ "description": "Create a prompt template with dynamic variables.",
+ "icon": "prompts",
+ "is_input": null,
+ "is_output": null,
+ "is_composition": null,
+ "base_classes": ["object", "Text", "str"],
+ "name": "",
+ "display_name": "Prompt",
+ "documentation": "",
+ "custom_fields": {
+ "template": ["context", "question"]
},
- {
- "id": "OpenAIEmbeddings-9TPjc",
- "type": "genericNode",
- "position": {
- "x": 2814.0402191223047,
- "y": 1955.9268168273086
- },
- "data": {
- "type": "OpenAIEmbeddings",
- "node": {
- "template": {
- "allowed_special": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": [],
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "allowed_special",
- "display_name": "Allowed Special",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "chunk_size": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 1000,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "chunk_size",
- "display_name": "Chunk Size",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "client": {
- "type": "Any",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "client",
- "display_name": "Client",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Any, Dict, List, Optional\n\nfrom langchain_openai.embeddings.base import OpenAIEmbeddings\n\nfrom langflow.field_typing import Embeddings, NestedDict\nfrom langflow.interface.custom.custom_component import CustomComponent\n\n\nclass OpenAIEmbeddingsComponent(CustomComponent):\n display_name = \"OpenAI Embeddings\"\n description = \"Generate embeddings using OpenAI models.\"\n\n def build_config(self):\n return {\n \"allowed_special\": {\n \"display_name\": \"Allowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"default_headers\": {\n \"display_name\": \"Default Headers\",\n \"advanced\": True,\n \"field_type\": \"dict\",\n },\n \"default_query\": {\n \"display_name\": \"Default Query\",\n \"advanced\": True,\n \"field_type\": \"NestedDict\",\n },\n \"disallowed_special\": {\n \"display_name\": \"Disallowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"chunk_size\": {\"display_name\": \"Chunk Size\", \"advanced\": True},\n \"client\": {\"display_name\": \"Client\", \"advanced\": True},\n \"deployment\": {\"display_name\": \"Deployment\", \"advanced\": True},\n \"embedding_ctx_length\": {\n \"display_name\": \"Embedding Context Length\",\n \"advanced\": True,\n },\n \"max_retries\": {\"display_name\": \"Max Retries\", \"advanced\": True},\n \"model\": {\n \"display_name\": \"Model\",\n \"advanced\": False,\n \"options\": [\n \"text-embedding-3-small\",\n \"text-embedding-3-large\",\n \"text-embedding-ada-002\",\n ],\n },\n \"model_kwargs\": {\"display_name\": \"Model Kwargs\", \"advanced\": True},\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"password\": True,\n \"advanced\": True,\n },\n \"openai_api_key\": {\"display_name\": \"OpenAI API Key\", \"password\": True},\n \"openai_api_type\": {\n \"display_name\": \"OpenAI API Type\",\n \"advanced\": True,\n \"password\": True,\n },\n \"openai_api_version\": {\n 
\"display_name\": \"OpenAI API Version\",\n \"advanced\": True,\n },\n \"openai_organization\": {\n \"display_name\": \"OpenAI Organization\",\n \"advanced\": True,\n },\n \"openai_proxy\": {\"display_name\": \"OpenAI Proxy\", \"advanced\": True},\n \"request_timeout\": {\"display_name\": \"Request Timeout\", \"advanced\": True},\n \"show_progress_bar\": {\n \"display_name\": \"Show Progress Bar\",\n \"advanced\": True,\n },\n \"skip_empty\": {\"display_name\": \"Skip Empty\", \"advanced\": True},\n \"tiktoken_model_name\": {\n \"display_name\": \"TikToken Model Name\",\n \"advanced\": True,\n },\n \"tiktoken_enable\": {\"display_name\": \"TikToken Enable\", \"advanced\": True},\n }\n\n def build(\n self,\n openai_api_key: str,\n default_headers: Optional[Dict[str, str]] = None,\n default_query: Optional[NestedDict] = {},\n allowed_special: List[str] = [],\n disallowed_special: List[str] = [\"all\"],\n chunk_size: int = 1000,\n client: Optional[Any] = None,\n deployment: str = \"text-embedding-ada-002\",\n embedding_ctx_length: int = 8191,\n max_retries: int = 6,\n model: str = \"text-embedding-ada-002\",\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n openai_api_type: Optional[str] = None,\n openai_api_version: Optional[str] = None,\n openai_organization: Optional[str] = None,\n openai_proxy: Optional[str] = None,\n request_timeout: Optional[float] = None,\n show_progress_bar: bool = False,\n skip_empty: bool = False,\n tiktoken_enable: bool = True,\n tiktoken_model_name: Optional[str] = None,\n ) -> Embeddings:\n # This is to avoid errors with Vector Stores (e.g Chroma)\n if disallowed_special == [\"all\"]:\n disallowed_special = \"all\" # type: ignore\n\n return OpenAIEmbeddings(\n tiktoken_enabled=tiktoken_enable,\n default_headers=default_headers,\n default_query=default_query,\n allowed_special=set(allowed_special),\n disallowed_special=\"all\",\n chunk_size=chunk_size,\n client=client,\n deployment=deployment,\n 
embedding_ctx_length=embedding_ctx_length,\n max_retries=max_retries,\n model=model,\n model_kwargs=model_kwargs,\n base_url=openai_api_base,\n api_key=openai_api_key,\n openai_api_type=openai_api_type,\n api_version=openai_api_version,\n organization=openai_organization,\n openai_proxy=openai_proxy,\n timeout=request_timeout,\n show_progress_bar=show_progress_bar,\n skip_empty=skip_empty,\n tiktoken_model_name=tiktoken_model_name,\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "default_headers": {
- "type": "dict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "default_headers",
- "display_name": "Default Headers",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "default_query": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "default_query",
- "display_name": "Default Query",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "deployment": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "text-embedding-ada-002",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "deployment",
- "display_name": "Deployment",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "disallowed_special": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": [
- "all"
- ],
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "disallowed_special",
- "display_name": "Disallowed Special",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "embedding_ctx_length": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 8191,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "embedding_ctx_length",
- "display_name": "Embedding Context Length",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_retries": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 6,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_retries",
- "display_name": "Max Retries",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "text-embedding-ada-002",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "text-embedding-3-small",
- "text-embedding-3-large",
- "text-embedding-ada-002"
- ],
- "name": "model",
- "display_name": "Model",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": ""
- },
- "openai_api_type": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_type",
- "display_name": "OpenAI API Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_version": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_version",
- "display_name": "OpenAI API Version",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_organization": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_organization",
- "display_name": "OpenAI Organization",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_proxy": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_proxy",
- "display_name": "OpenAI Proxy",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "request_timeout": {
- "type": "float",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "request_timeout",
- "display_name": "Request Timeout",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "show_progress_bar": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "show_progress_bar",
- "display_name": "Show Progress Bar",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "skip_empty": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "skip_empty",
- "display_name": "Skip Empty",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "tiktoken_enable": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "tiktoken_enable",
- "display_name": "TikToken Enable",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "tiktoken_model_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "tiktoken_model_name",
- "display_name": "TikToken Model Name",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Generate embeddings using OpenAI models.",
- "base_classes": [
- "Embeddings"
- ],
- "display_name": "OpenAI Embeddings",
- "documentation": "",
- "custom_fields": {
- "openai_api_key": null,
- "default_headers": null,
- "default_query": null,
- "allowed_special": null,
- "disallowed_special": null,
- "chunk_size": null,
- "client": null,
- "deployment": null,
- "embedding_ctx_length": null,
- "max_retries": null,
- "model": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "openai_api_type": null,
- "openai_api_version": null,
- "openai_organization": null,
- "openai_proxy": null,
- "request_timeout": null,
- "show_progress_bar": null,
- "skip_empty": null,
- "tiktoken_enable": null,
- "tiktoken_model_name": null
- },
- "output_types": [
- "Embeddings"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "OpenAIEmbeddings-9TPjc"
- },
- "selected": false,
- "width": 384,
- "height": 383,
- "positionAbsolute": {
- "x": 2814.0402191223047,
- "y": 1955.9268168273086
- },
- "dragging": false
- }
- ],
- "edges": [
- {
- "source": "TextOutput-BDknO",
- "target": "Prompt-xeI6K",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153TextOutput\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153}",
- "targetHandle": "{\u0153fieldName\u0153:\u0153context\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "id": "reactflow__edge-TextOutput-BDknO{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153TextOutput\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153}-Prompt-xeI6K{\u0153fieldName\u0153:\u0153context\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "context",
- "id": "Prompt-xeI6K",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "Text",
- "str"
- ],
- "dataType": "TextOutput",
- "id": "TextOutput-BDknO"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "selected": false
+ "output_types": ["Text"],
+ "full_path": null,
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false,
+ "error": null
+ },
+ "id": "Prompt-xeI6K",
+ "description": "Create a prompt template with dynamic variables.",
+ "display_name": "Prompt"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 477,
+ "positionAbsolute": {
+ "x": 2969.0261961391298,
+ "y": 442.1613649809069
+ },
+ "dragging": false
+ },
+ {
+ "id": "ChatOutput-Q39I8",
+ "type": "genericNode",
+ "position": {
+ "x": 3887.2073667611485,
+ "y": 588.4801225794856
+ },
+ "data": {
+ "type": "ChatOutput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template,\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "{text}",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "In case of Message being a Record, this template will be used to convert it to text.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Machine",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "AI",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "ChatInput-yxMKE",
- "target": "Prompt-xeI6K",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}",
- "targetHandle": "{\u0153fieldName\u0153:\u0153question\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "id": "reactflow__edge-ChatInput-yxMKE{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}-Prompt-xeI6K{\u0153fieldName\u0153:\u0153question\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "question",
- "id": "Prompt-xeI6K",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "Text",
- "str",
- "object",
- "Record"
- ],
- "dataType": "ChatInput",
- "id": "ChatInput-yxMKE"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "selected": false
+ "description": "Display a chat message in the Playground.",
+ "icon": "ChatOutput",
+ "base_classes": ["object", "Text", "Record", "str"],
+ "display_name": "Chat Output",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null,
+ "record_template": null
},
- {
- "source": "Prompt-xeI6K",
- "target": "OpenAIModel-EjXlN",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153}",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "id": "reactflow__edge-Prompt-xeI6K{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153}-OpenAIModel-EjXlN{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "OpenAIModel-EjXlN",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "Text",
- "str"
- ],
- "dataType": "Prompt",
- "id": "Prompt-xeI6K"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "selected": false
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatOutput-Q39I8"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 383,
+ "positionAbsolute": {
+ "x": 3887.2073667611485,
+ "y": 588.4801225794856
+ },
+ "dragging": false
+ },
+ {
+ "id": "File-t0a6a",
+ "type": "genericNode",
+ "position": {
+ "x": 2257.233450682836,
+ "y": 1747.5389618367233
+ },
+ "data": {
+ "type": "File",
+ "node": {
+ "template": {
+ "path": {
+ "type": "file",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [
+ ".txt",
+ ".md",
+ ".mdx",
+ ".csv",
+ ".json",
+ ".yaml",
+ ".yml",
+ ".xml",
+ ".html",
+ ".htm",
+ ".pdf",
+ ".docx"
+ ],
+ "file_path": "51e2b78a-199b-4054-9f32-e288eef6924c/Langflow conversation.pdf",
+ "password": false,
+ "name": "path",
+ "display_name": "Path",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Supported file types: txt, md, mdx, csv, json, yaml, yml, xml, html, htm, pdf, docx",
+ "load_from_db": false,
+ "title_case": false,
+ "value": ""
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from pathlib import Path\nfrom typing import Any, Dict\n\nfrom langflow.base.data.utils import TEXT_FILE_TYPES, parse_text_file_to_record\nfrom langflow.interface.custom.custom_component import CustomComponent\nfrom langflow.schema import Record\n\n\nclass FileComponent(CustomComponent):\n display_name = \"File\"\n description = \"A generic file loader.\"\n icon = \"file-text\"\n\n def build_config(self) -> Dict[str, Any]:\n return {\n \"path\": {\n \"display_name\": \"Path\",\n \"field_type\": \"file\",\n \"file_types\": TEXT_FILE_TYPES,\n \"info\": f\"Supported file types: {', '.join(TEXT_FILE_TYPES)}\",\n },\n \"silent_errors\": {\n \"display_name\": \"Silent Errors\",\n \"advanced\": True,\n \"info\": \"If true, errors will not raise an exception.\",\n },\n }\n\n def load_file(self, path: str, silent_errors: bool = False) -> Record:\n resolved_path = self.resolve_path(path)\n path_obj = Path(resolved_path)\n extension = path_obj.suffix[1:].lower()\n if extension == \"doc\":\n raise ValueError(\"doc files are not supported. Please save as .docx\")\n if extension not in TEXT_FILE_TYPES:\n raise ValueError(f\"Unsupported file type: {extension}\")\n record = parse_text_file_to_record(resolved_path, silent_errors)\n self.status = record if record else \"No data\"\n return record or Record()\n\n def build(\n self,\n path: str,\n silent_errors: bool = False,\n ) -> Record:\n record = self.load_file(path, silent_errors)\n self.status = record\n return record\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "silent_errors": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "silent_errors",
+ "display_name": "Silent Errors",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If true, errors will not raise an exception.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "OpenAIModel-EjXlN",
- "target": "ChatOutput-Q39I8",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153}",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-Q39I8\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "id": "reactflow__edge-OpenAIModel-EjXlN{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153}-ChatOutput-Q39I8{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-Q39I8\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "ChatOutput-Q39I8",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "Text",
- "str"
- ],
- "dataType": "OpenAIModel",
- "id": "OpenAIModel-EjXlN"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "selected": false
+ "description": "A generic file loader.",
+ "icon": "file-text",
+ "base_classes": ["Record"],
+ "display_name": "File",
+ "documentation": "",
+ "custom_fields": {
+ "path": null,
+ "silent_errors": null
},
- {
- "source": "File-t0a6a",
- "target": "RecursiveCharacterTextSplitter-tR9QM",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153File\u0153,\u0153id\u0153:\u0153File-t0a6a\u0153}",
- "targetHandle": "{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153Record\u0153],\u0153type\u0153:\u0153Document\u0153}",
- "id": "reactflow__edge-File-t0a6a{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153File\u0153,\u0153id\u0153:\u0153File-t0a6a\u0153}-RecursiveCharacterTextSplitter-tR9QM{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153Record\u0153],\u0153type\u0153:\u0153Document\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "inputs",
- "id": "RecursiveCharacterTextSplitter-tR9QM",
- "inputTypes": [
- "Document",
- "Record"
- ],
- "type": "Document"
- },
- "sourceHandle": {
- "baseClasses": [
- "Record"
- ],
- "dataType": "File",
- "id": "File-t0a6a"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "selected": false
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "File-t0a6a"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 281,
+ "positionAbsolute": {
+ "x": 2257.233450682836,
+ "y": 1747.5389618367233
+ },
+ "dragging": false
+ },
+ {
+ "id": "RecursiveCharacterTextSplitter-tR9QM",
+ "type": "genericNode",
+ "position": {
+ "x": 2791.013514133929,
+ "y": 1462.9588953494142
+ },
+ "data": {
+ "type": "RecursiveCharacterTextSplitter",
+ "node": {
+ "template": {
+ "inputs": {
+ "type": "Document",
+ "required": true,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "inputs",
+ "display_name": "Input",
+ "advanced": false,
+ "input_types": ["Document", "Record"],
+ "dynamic": false,
+ "info": "The texts to split.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "chunk_overlap": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 200,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "chunk_overlap",
+ "display_name": "Chunk Overlap",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The amount of overlap between chunks.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "chunk_size": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 1000,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "chunk_size",
+ "display_name": "Chunk Size",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The maximum length of each chunk.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langchain.text_splitter import RecursiveCharacterTextSplitter\nfrom langchain_core.documents import Document\n\nfrom langflow.interface.custom.custom_component import CustomComponent\nfrom langflow.schema import Record\nfrom langflow.utils.util import build_loader_repr_from_records, unescape_string\n\n\nclass RecursiveCharacterTextSplitterComponent(CustomComponent):\n display_name: str = \"Recursive Character Text Splitter\"\n description: str = \"Split text into chunks of a specified length.\"\n documentation: str = \"https://docs.langflow.org/components/text-splitters#recursivecharactertextsplitter\"\n\n def build_config(self):\n return {\n \"inputs\": {\n \"display_name\": \"Input\",\n \"info\": \"The texts to split.\",\n \"input_types\": [\"Document\", \"Record\"],\n },\n \"separators\": {\n \"display_name\": \"Separators\",\n \"info\": 'The characters to split on.\\nIf left empty defaults to [\"\\\\n\\\\n\", \"\\\\n\", \" \", \"\"].',\n \"is_list\": True,\n },\n \"chunk_size\": {\n \"display_name\": \"Chunk Size\",\n \"info\": \"The maximum length of each chunk.\",\n \"field_type\": \"int\",\n \"value\": 1000,\n },\n \"chunk_overlap\": {\n \"display_name\": \"Chunk Overlap\",\n \"info\": \"The amount of overlap between chunks.\",\n \"field_type\": \"int\",\n \"value\": 200,\n },\n \"code\": {\"show\": False},\n }\n\n def build(\n self,\n inputs: list[Document],\n separators: Optional[list[str]] = None,\n chunk_size: Optional[int] = 1000,\n chunk_overlap: Optional[int] = 200,\n ) -> list[Record]:\n \"\"\"\n Split text into chunks of a specified length.\n\n Args:\n separators (list[str]): The characters to split on.\n chunk_size (int): The maximum length of each chunk.\n chunk_overlap (int): The amount of overlap between chunks.\n length_function (function): The function to use to calculate the length of the text.\n\n Returns:\n list[str]: The chunks of text.\n \"\"\"\n\n if separators == \"\":\n separators = 
None\n elif separators:\n # check if the separators list has escaped characters\n # if there are escaped characters, unescape them\n separators = [unescape_string(x) for x in separators]\n\n # Make sure chunk_size and chunk_overlap are ints\n if isinstance(chunk_size, str):\n chunk_size = int(chunk_size)\n if isinstance(chunk_overlap, str):\n chunk_overlap = int(chunk_overlap)\n splitter = RecursiveCharacterTextSplitter(\n separators=separators,\n chunk_size=chunk_size,\n chunk_overlap=chunk_overlap,\n )\n documents = []\n for _input in inputs:\n if isinstance(_input, Record):\n documents.append(_input.to_lc_document())\n else:\n documents.append(_input)\n docs = splitter.split_documents(documents)\n records = self.to_records(docs)\n self.repr_value = build_loader_repr_from_records(records)\n return records\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "separators": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "separators",
+ "display_name": "Separators",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The characters to split on.\nIf left empty defaults to [\"\\n\\n\", \"\\n\", \" \", \"\"].",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": [""]
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "OpenAIEmbeddings-ZlOk1",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-ZlOk1\u0153}",
- "target": "AstraDBSearch-41nRz",
- "targetHandle": "{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "embedding",
- "id": "AstraDBSearch-41nRz",
- "inputTypes": null,
- "type": "Embeddings"
- },
- "sourceHandle": {
- "baseClasses": [
- "Embeddings"
- ],
- "dataType": "OpenAIEmbeddings",
- "id": "OpenAIEmbeddings-ZlOk1"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-OpenAIEmbeddings-ZlOk1{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-ZlOk1\u0153}-AstraDBSearch-41nRz{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}"
+ "description": "Split text into chunks of a specified length.",
+ "base_classes": ["Record"],
+ "display_name": "Recursive Character Text Splitter",
+ "documentation": "https://docs.langflow.org/components/text-splitters#recursivecharactertextsplitter",
+ "custom_fields": {
+ "inputs": null,
+ "separators": null,
+ "chunk_size": null,
+ "chunk_overlap": null
},
- {
- "source": "ChatInput-yxMKE",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}",
- "target": "AstraDBSearch-41nRz",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "AstraDBSearch-41nRz",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "Text",
- "str",
- "object",
- "Record"
- ],
- "dataType": "ChatInput",
- "id": "ChatInput-yxMKE"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-ChatInput-yxMKE{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}-AstraDBSearch-41nRz{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "RecursiveCharacterTextSplitter-tR9QM"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 501,
+ "positionAbsolute": {
+ "x": 2791.013514133929,
+ "y": 1462.9588953494142
+ },
+ "dragging": false
+ },
+ {
+ "id": "AstraDBSearch-41nRz",
+ "type": "genericNode",
+ "position": {
+ "x": 1723.976434815103,
+ "y": 277.03317407245913
+ },
+ "data": {
+ "type": "AstraDBSearch",
+ "node": {
+ "template": {
+ "embedding": {
+ "type": "Embeddings",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "embedding",
+ "display_name": "Embedding",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Embedding to use",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Input Value",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Input value to search",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "api_endpoint": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "api_endpoint",
+ "display_name": "API Endpoint",
+ "advanced": false,
+ "dynamic": false,
+ "info": "API endpoint URL for the Astra DB service.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "ASTRA_DB_API_ENDPOINT"
+ },
+ "batch_size": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "batch_size",
+ "display_name": "Batch Size",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional number of records to process in a single batch.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_delete_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_delete_concurrency",
+ "display_name": "Bulk Delete Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk delete operations.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_insert_batch_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_insert_batch_concurrency",
+ "display_name": "Bulk Insert Batch Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk insert operations.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_insert_overwrite_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_insert_overwrite_concurrency",
+ "display_name": "Bulk Insert Overwrite Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk insert operations that overwrite existing records.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import List, Optional\n\nfrom langflow.components.vectorstores.AstraDB import AstraDBVectorStoreComponent\nfrom langflow.components.vectorstores.base.model import LCVectorStoreComponent\nfrom langflow.field_typing import Embeddings, Text\nfrom langflow.schema import Record\n\n\nclass AstraDBSearchComponent(LCVectorStoreComponent):\n display_name = \"Astra DB Search\"\n description = \"Searches an existing Astra DB Vector Store.\"\n icon = \"AstraDB\"\n field_order = [\"token\", \"api_endpoint\", \"collection_name\", \"input_value\", \"embedding\"]\n\n def build_config(self):\n return {\n \"search_type\": {\n \"display_name\": \"Search Type\",\n \"options\": [\"Similarity\", \"MMR\"],\n },\n \"input_value\": {\n \"display_name\": \"Input Value\",\n \"info\": \"Input value to search\",\n },\n \"embedding\": {\"display_name\": \"Embedding\", \"info\": \"Embedding to use\"},\n \"collection_name\": {\n \"display_name\": \"Collection Name\",\n \"info\": \"The name of the collection within Astra DB where the vectors will be stored.\",\n },\n \"token\": {\n \"display_name\": \"Token\",\n \"info\": \"Authentication token for accessing Astra DB.\",\n \"password\": True,\n },\n \"api_endpoint\": {\n \"display_name\": \"API Endpoint\",\n \"info\": \"API endpoint URL for the Astra DB service.\",\n },\n \"namespace\": {\n \"display_name\": \"Namespace\",\n \"info\": \"Optional namespace within Astra DB to use for the collection.\",\n \"advanced\": True,\n },\n \"metric\": {\n \"display_name\": \"Metric\",\n \"info\": \"Optional distance metric for vector comparisons in the vector store.\",\n \"advanced\": True,\n },\n \"batch_size\": {\n \"display_name\": \"Batch Size\",\n \"info\": \"Optional number of records to process in a single batch.\",\n \"advanced\": True,\n },\n \"bulk_insert_batch_concurrency\": {\n \"display_name\": \"Bulk Insert Batch Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations.\",\n \"advanced\": 
True,\n },\n \"bulk_insert_overwrite_concurrency\": {\n \"display_name\": \"Bulk Insert Overwrite Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations that overwrite existing records.\",\n \"advanced\": True,\n },\n \"bulk_delete_concurrency\": {\n \"display_name\": \"Bulk Delete Concurrency\",\n \"info\": \"Optional concurrency level for bulk delete operations.\",\n \"advanced\": True,\n },\n \"setup_mode\": {\n \"display_name\": \"Setup Mode\",\n \"info\": \"Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.\",\n \"options\": [\"Sync\", \"Async\", \"Off\"],\n \"advanced\": True,\n },\n \"pre_delete_collection\": {\n \"display_name\": \"Pre Delete Collection\",\n \"info\": \"Boolean flag to determine whether to delete the collection before creating a new one.\",\n \"advanced\": True,\n },\n \"metadata_indexing_include\": {\n \"display_name\": \"Metadata Indexing Include\",\n \"info\": \"Optional list of metadata fields to include in the indexing.\",\n \"advanced\": True,\n },\n \"metadata_indexing_exclude\": {\n \"display_name\": \"Metadata Indexing Exclude\",\n \"info\": \"Optional list of metadata fields to exclude from the indexing.\",\n \"advanced\": True,\n },\n \"collection_indexing_policy\": {\n \"display_name\": \"Collection Indexing Policy\",\n \"info\": \"Optional dictionary defining the indexing policy for the collection.\",\n \"advanced\": True,\n },\n \"number_of_results\": {\n \"display_name\": \"Number of Results\",\n \"info\": \"Number of results to return.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n embedding: Embeddings,\n collection_name: str,\n input_value: Text,\n token: str,\n api_endpoint: str,\n search_type: str = \"Similarity\",\n number_of_results: int = 4,\n namespace: Optional[str] = None,\n metric: Optional[str] = None,\n batch_size: Optional[int] = None,\n bulk_insert_batch_concurrency: Optional[int] = None,\n 
bulk_insert_overwrite_concurrency: Optional[int] = None,\n bulk_delete_concurrency: Optional[int] = None,\n setup_mode: str = \"Sync\",\n pre_delete_collection: bool = False,\n metadata_indexing_include: Optional[List[str]] = None,\n metadata_indexing_exclude: Optional[List[str]] = None,\n collection_indexing_policy: Optional[dict] = None,\n ) -> List[Record]:\n vector_store = AstraDBVectorStoreComponent().build(\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n try:\n return self.search_with_vector_store(input_value, search_type, vector_store, k=number_of_results)\n except KeyError as e:\n if \"content\" in str(e):\n raise ValueError(\n \"You should ingest data through Langflow (or LangChain) to query it in Langflow. Your collection does not contain a field name 'content'.\"\n )\n else:\n raise e\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "collection_indexing_policy": {
+ "type": "dict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "collection_indexing_policy",
+ "display_name": "Collection Indexing Policy",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional dictionary defining the indexing policy for the collection.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "collection_name": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "collection_name",
+ "display_name": "Collection Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The name of the collection within Astra DB where the vectors will be stored.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "langflow"
+ },
+ "metadata_indexing_exclude": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metadata_indexing_exclude",
+ "display_name": "Metadata Indexing Exclude",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional list of metadata fields to exclude from the indexing.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "metadata_indexing_include": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metadata_indexing_include",
+ "display_name": "Metadata Indexing Include",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional list of metadata fields to include in the indexing.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "metric": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metric",
+ "display_name": "Metric",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional distance metric for vector comparisons in the vector store.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "namespace": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "namespace",
+ "display_name": "Namespace",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional namespace within Astra DB to use for the collection.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "number_of_results": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 4,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "number_of_results",
+ "display_name": "Number of Results",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Number of results to return.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "pre_delete_collection": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "pre_delete_collection",
+ "display_name": "Pre Delete Collection",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Boolean flag to determine whether to delete the collection before creating a new one.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "search_type": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Similarity",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Similarity", "MMR"],
+ "name": "search_type",
+ "display_name": "Search Type",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "setup_mode": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Sync",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Sync", "Async", "Off"],
+ "name": "setup_mode",
+ "display_name": "Setup Mode",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "token": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "token",
+ "display_name": "Token",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Authentication token for accessing Astra DB.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "ASTRA_DB_APPLICATION_TOKEN"
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "RecursiveCharacterTextSplitter-tR9QM",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153RecursiveCharacterTextSplitter\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153}",
- "target": "AstraDB-eUCSS",
- "targetHandle": "{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Record\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "inputs",
- "id": "AstraDB-eUCSS",
- "inputTypes": null,
- "type": "Record"
- },
- "sourceHandle": {
- "baseClasses": [
- "Record"
- ],
- "dataType": "RecursiveCharacterTextSplitter",
- "id": "RecursiveCharacterTextSplitter-tR9QM"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-RecursiveCharacterTextSplitter-tR9QM{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153RecursiveCharacterTextSplitter\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153}-AstraDB-eUCSS{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Record\u0153}",
- "selected": false
+ "description": "Searches an existing Astra DB Vector Store.",
+ "icon": "AstraDB",
+ "base_classes": ["Record"],
+ "display_name": "Astra DB Search",
+ "documentation": "",
+ "custom_fields": {
+ "embedding": null,
+ "collection_name": null,
+ "input_value": null,
+ "token": null,
+ "api_endpoint": null,
+ "search_type": null,
+ "number_of_results": null,
+ "namespace": null,
+ "metric": null,
+ "batch_size": null,
+ "bulk_insert_batch_concurrency": null,
+ "bulk_insert_overwrite_concurrency": null,
+ "bulk_delete_concurrency": null,
+ "setup_mode": null,
+ "pre_delete_collection": null,
+ "metadata_indexing_include": null,
+ "metadata_indexing_exclude": null,
+ "collection_indexing_policy": null
},
- {
- "source": "OpenAIEmbeddings-9TPjc",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-9TPjc\u0153}",
- "target": "AstraDB-eUCSS",
- "targetHandle": "{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "embedding",
- "id": "AstraDB-eUCSS",
- "inputTypes": null,
- "type": "Embeddings"
- },
- "sourceHandle": {
- "baseClasses": [
- "Embeddings"
- ],
- "dataType": "OpenAIEmbeddings",
- "id": "OpenAIEmbeddings-9TPjc"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-OpenAIEmbeddings-9TPjc{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-9TPjc\u0153}-AstraDB-eUCSS{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}",
- "selected": false
- },
- {
- "source": "AstraDBSearch-41nRz",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153AstraDBSearch\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153}",
- "target": "TextOutput-BDknO",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "TextOutput-BDknO",
- "inputTypes": [
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "Record"
- ],
- "dataType": "AstraDBSearch",
- "id": "AstraDBSearch-41nRz"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-AstraDBSearch-41nRz{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153AstraDBSearch\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153}-TextOutput-BDknO{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- }
- ],
- "viewport": {
- "x": -259.6782520315529,
- "y": 90.3428735006047,
- "zoom": 0.2687057134854984
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "token",
+ "api_endpoint",
+ "collection_name",
+ "input_value",
+ "embedding"
+ ],
+ "beta": false
+ },
+ "id": "AstraDBSearch-41nRz"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 713,
+ "dragging": false,
+ "positionAbsolute": {
+ "x": 1723.976434815103,
+ "y": 277.03317407245913
}
- },
- "description": "Visit https://pre-release.langflow.org/tutorials/rag-with-astradb for a detailed guide of this project.\nThis project give you both Ingestion and RAG in a single file. You'll need to visit https://astra.datastax.com/ to create an Astra DB instance, your Token and grab an API Endpoint.\nRunning this project requires you to add a file in the Files component, then define a Collection Name and click on the Play icon on the Astra DB component. \n\nAfter the ingestion ends you are ready to click on the Run button at the lower left corner and start asking questions about your data.",
- "name": "Vector Store RAG",
- "last_tested_version": "1.0.0a0",
- "is_component": false
+ },
+ {
+ "id": "AstraDB-eUCSS",
+ "type": "genericNode",
+ "position": {
+ "x": 3372.04958055989,
+ "y": 1611.0742035495277
+ },
+ "data": {
+ "type": "AstraDB",
+ "node": {
+ "template": {
+ "embedding": {
+ "type": "Embeddings",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "embedding",
+ "display_name": "Embedding",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Embedding to use",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "inputs": {
+ "type": "Record",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "inputs",
+ "display_name": "Inputs",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Optional list of records to be processed and stored in the vector store.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "api_endpoint": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "api_endpoint",
+ "display_name": "API Endpoint",
+ "advanced": false,
+ "dynamic": false,
+ "info": "API endpoint URL for the Astra DB service.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "ASTRA_DB_API_ENDPOINT"
+ },
+ "batch_size": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "batch_size",
+ "display_name": "Batch Size",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional number of records to process in a single batch.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_delete_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_delete_concurrency",
+ "display_name": "Bulk Delete Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk delete operations.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_insert_batch_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_insert_batch_concurrency",
+ "display_name": "Bulk Insert Batch Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk insert operations.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_insert_overwrite_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_insert_overwrite_concurrency",
+ "display_name": "Bulk Insert Overwrite Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk insert operations that overwrite existing records.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import List, Optional\n\nfrom langchain_astradb import AstraDBVectorStore\nfrom langchain_astradb.utils.astradb import SetupMode\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Embeddings, VectorStore\nfrom langflow.schema import Record\n\n\nclass AstraDBVectorStoreComponent(CustomComponent):\n display_name = \"Astra DB\"\n description = \"Builds or loads an Astra DB Vector Store.\"\n icon = \"AstraDB\"\n field_order = [\"token\", \"api_endpoint\", \"collection_name\", \"inputs\", \"embedding\"]\n\n def build_config(self):\n return {\n \"inputs\": {\n \"display_name\": \"Inputs\",\n \"info\": \"Optional list of records to be processed and stored in the vector store.\",\n },\n \"embedding\": {\"display_name\": \"Embedding\", \"info\": \"Embedding to use\"},\n \"collection_name\": {\n \"display_name\": \"Collection Name\",\n \"info\": \"The name of the collection within Astra DB where the vectors will be stored.\",\n },\n \"token\": {\n \"display_name\": \"Token\",\n \"info\": \"Authentication token for accessing Astra DB.\",\n \"password\": True,\n },\n \"api_endpoint\": {\n \"display_name\": \"API Endpoint\",\n \"info\": \"API endpoint URL for the Astra DB service.\",\n },\n \"namespace\": {\n \"display_name\": \"Namespace\",\n \"info\": \"Optional namespace within Astra DB to use for the collection.\",\n \"advanced\": True,\n },\n \"metric\": {\n \"display_name\": \"Metric\",\n \"info\": \"Optional distance metric for vector comparisons in the vector store.\",\n \"advanced\": True,\n },\n \"batch_size\": {\n \"display_name\": \"Batch Size\",\n \"info\": \"Optional number of records to process in a single batch.\",\n \"advanced\": True,\n },\n \"bulk_insert_batch_concurrency\": {\n \"display_name\": \"Bulk Insert Batch Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations.\",\n \"advanced\": True,\n },\n \"bulk_insert_overwrite_concurrency\": {\n \"display_name\": \"Bulk Insert 
Overwrite Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations that overwrite existing records.\",\n \"advanced\": True,\n },\n \"bulk_delete_concurrency\": {\n \"display_name\": \"Bulk Delete Concurrency\",\n \"info\": \"Optional concurrency level for bulk delete operations.\",\n \"advanced\": True,\n },\n \"setup_mode\": {\n \"display_name\": \"Setup Mode\",\n \"info\": \"Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.\",\n \"options\": [\"Sync\", \"Async\", \"Off\"],\n \"advanced\": True,\n },\n \"pre_delete_collection\": {\n \"display_name\": \"Pre Delete Collection\",\n \"info\": \"Boolean flag to determine whether to delete the collection before creating a new one.\",\n \"advanced\": True,\n },\n \"metadata_indexing_include\": {\n \"display_name\": \"Metadata Indexing Include\",\n \"info\": \"Optional list of metadata fields to include in the indexing.\",\n \"advanced\": True,\n },\n \"metadata_indexing_exclude\": {\n \"display_name\": \"Metadata Indexing Exclude\",\n \"info\": \"Optional list of metadata fields to exclude from the indexing.\",\n \"advanced\": True,\n },\n \"collection_indexing_policy\": {\n \"display_name\": \"Collection Indexing Policy\",\n \"info\": \"Optional dictionary defining the indexing policy for the collection.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n embedding: Embeddings,\n token: str,\n api_endpoint: str,\n collection_name: str,\n inputs: Optional[List[Record]] = None,\n namespace: Optional[str] = None,\n metric: Optional[str] = None,\n batch_size: Optional[int] = None,\n bulk_insert_batch_concurrency: Optional[int] = None,\n bulk_insert_overwrite_concurrency: Optional[int] = None,\n bulk_delete_concurrency: Optional[int] = None,\n setup_mode: str = \"Async\",\n pre_delete_collection: bool = False,\n metadata_indexing_include: Optional[List[str]] = None,\n metadata_indexing_exclude: Optional[List[str]] = 
None,\n collection_indexing_policy: Optional[dict] = None,\n ) -> VectorStore:\n try:\n setup_mode_value = SetupMode[setup_mode.upper()]\n except KeyError:\n raise ValueError(f\"Invalid setup mode: {setup_mode}\")\n if inputs:\n documents = [_input.to_lc_document() for _input in inputs]\n\n vector_store = AstraDBVectorStore.from_documents(\n documents=documents,\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode_value,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n else:\n vector_store = AstraDBVectorStore(\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode_value,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n\n return vector_store\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "collection_indexing_policy": {
+ "type": "dict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "collection_indexing_policy",
+ "display_name": "Collection Indexing Policy",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional dictionary defining the indexing policy for the collection.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "collection_name": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "collection_name",
+ "display_name": "Collection Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The name of the collection within Astra DB where the vectors will be stored.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "langflow"
+ },
+ "metadata_indexing_exclude": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metadata_indexing_exclude",
+ "display_name": "Metadata Indexing Exclude",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional list of metadata fields to exclude from the indexing.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "metadata_indexing_include": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metadata_indexing_include",
+ "display_name": "Metadata Indexing Include",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional list of metadata fields to include in the indexing.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "metric": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metric",
+ "display_name": "Metric",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional distance metric for vector comparisons in the vector store.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "namespace": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "namespace",
+ "display_name": "Namespace",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional namespace within Astra DB to use for the collection.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "pre_delete_collection": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "pre_delete_collection",
+ "display_name": "Pre Delete Collection",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Boolean flag to determine whether to delete the collection before creating a new one.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "setup_mode": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Async",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Sync", "Async", "Off"],
+ "name": "setup_mode",
+ "display_name": "Setup Mode",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "token": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "token",
+ "display_name": "Token",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Authentication token for accessing Astra DB.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "ASTRA_DB_APPLICATION_TOKEN"
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Builds or loads an Astra DB Vector Store.",
+ "icon": "AstraDB",
+ "base_classes": ["VectorStore"],
+ "display_name": "Astra DB",
+ "documentation": "",
+ "custom_fields": {
+ "embedding": null,
+ "token": null,
+ "api_endpoint": null,
+ "collection_name": null,
+ "inputs": null,
+ "namespace": null,
+ "metric": null,
+ "batch_size": null,
+ "bulk_insert_batch_concurrency": null,
+ "bulk_insert_overwrite_concurrency": null,
+ "bulk_delete_concurrency": null,
+ "setup_mode": null,
+ "pre_delete_collection": null,
+ "metadata_indexing_include": null,
+ "metadata_indexing_exclude": null,
+ "collection_indexing_policy": null
+ },
+ "output_types": ["VectorStore"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "token",
+ "api_endpoint",
+ "collection_name",
+ "inputs",
+ "embedding"
+ ],
+ "beta": false
+ },
+ "id": "AstraDB-eUCSS"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 573,
+ "positionAbsolute": {
+ "x": 3372.04958055989,
+ "y": 1611.0742035495277
+ },
+ "dragging": false
+ },
+ {
+ "id": "OpenAIEmbeddings-9TPjc",
+ "type": "genericNode",
+ "position": {
+ "x": 2814.0402191223047,
+ "y": 1955.9268168273086
+ },
+ "data": {
+ "type": "OpenAIEmbeddings",
+ "node": {
+ "template": {
+ "allowed_special": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": [],
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "allowed_special",
+ "display_name": "Allowed Special",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "chunk_size": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 1000,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "chunk_size",
+ "display_name": "Chunk Size",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "client": {
+ "type": "Any",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "client",
+ "display_name": "Client",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Any, Dict, List, Optional\n\nfrom langchain_openai.embeddings.base import OpenAIEmbeddings\n\nfrom langflow.field_typing import Embeddings, NestedDict\nfrom langflow.interface.custom.custom_component import CustomComponent\n\n\nclass OpenAIEmbeddingsComponent(CustomComponent):\n display_name = \"OpenAI Embeddings\"\n description = \"Generate embeddings using OpenAI models.\"\n\n def build_config(self):\n return {\n \"allowed_special\": {\n \"display_name\": \"Allowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"default_headers\": {\n \"display_name\": \"Default Headers\",\n \"advanced\": True,\n \"field_type\": \"dict\",\n },\n \"default_query\": {\n \"display_name\": \"Default Query\",\n \"advanced\": True,\n \"field_type\": \"NestedDict\",\n },\n \"disallowed_special\": {\n \"display_name\": \"Disallowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"chunk_size\": {\"display_name\": \"Chunk Size\", \"advanced\": True},\n \"client\": {\"display_name\": \"Client\", \"advanced\": True},\n \"deployment\": {\"display_name\": \"Deployment\", \"advanced\": True},\n \"embedding_ctx_length\": {\n \"display_name\": \"Embedding Context Length\",\n \"advanced\": True,\n },\n \"max_retries\": {\"display_name\": \"Max Retries\", \"advanced\": True},\n \"model\": {\n \"display_name\": \"Model\",\n \"advanced\": False,\n \"options\": [\n \"text-embedding-3-small\",\n \"text-embedding-3-large\",\n \"text-embedding-ada-002\",\n ],\n },\n \"model_kwargs\": {\"display_name\": \"Model Kwargs\", \"advanced\": True},\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"password\": True,\n \"advanced\": True,\n },\n \"openai_api_key\": {\"display_name\": \"OpenAI API Key\", \"password\": True},\n \"openai_api_type\": {\n \"display_name\": \"OpenAI API Type\",\n \"advanced\": True,\n \"password\": True,\n },\n \"openai_api_version\": {\n 
\"display_name\": \"OpenAI API Version\",\n \"advanced\": True,\n },\n \"openai_organization\": {\n \"display_name\": \"OpenAI Organization\",\n \"advanced\": True,\n },\n \"openai_proxy\": {\"display_name\": \"OpenAI Proxy\", \"advanced\": True},\n \"request_timeout\": {\"display_name\": \"Request Timeout\", \"advanced\": True},\n \"show_progress_bar\": {\n \"display_name\": \"Show Progress Bar\",\n \"advanced\": True,\n },\n \"skip_empty\": {\"display_name\": \"Skip Empty\", \"advanced\": True},\n \"tiktoken_model_name\": {\n \"display_name\": \"TikToken Model Name\",\n \"advanced\": True,\n },\n \"tiktoken_enable\": {\"display_name\": \"TikToken Enable\", \"advanced\": True},\n }\n\n def build(\n self,\n openai_api_key: str,\n default_headers: Optional[Dict[str, str]] = None,\n default_query: Optional[NestedDict] = {},\n allowed_special: List[str] = [],\n disallowed_special: List[str] = [\"all\"],\n chunk_size: int = 1000,\n client: Optional[Any] = None,\n deployment: str = \"text-embedding-ada-002\",\n embedding_ctx_length: int = 8191,\n max_retries: int = 6,\n model: str = \"text-embedding-ada-002\",\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n openai_api_type: Optional[str] = None,\n openai_api_version: Optional[str] = None,\n openai_organization: Optional[str] = None,\n openai_proxy: Optional[str] = None,\n request_timeout: Optional[float] = None,\n show_progress_bar: bool = False,\n skip_empty: bool = False,\n tiktoken_enable: bool = True,\n tiktoken_model_name: Optional[str] = None,\n ) -> Embeddings:\n # This is to avoid errors with Vector Stores (e.g Chroma)\n if disallowed_special == [\"all\"]:\n disallowed_special = \"all\" # type: ignore\n\n return OpenAIEmbeddings(\n tiktoken_enabled=tiktoken_enable,\n default_headers=default_headers,\n default_query=default_query,\n allowed_special=set(allowed_special),\n disallowed_special=\"all\",\n chunk_size=chunk_size,\n client=client,\n deployment=deployment,\n 
embedding_ctx_length=embedding_ctx_length,\n max_retries=max_retries,\n model=model,\n model_kwargs=model_kwargs,\n base_url=openai_api_base,\n api_key=openai_api_key,\n openai_api_type=openai_api_type,\n api_version=openai_api_version,\n organization=openai_organization,\n openai_proxy=openai_proxy,\n timeout=request_timeout,\n show_progress_bar=show_progress_bar,\n skip_empty=skip_empty,\n tiktoken_model_name=tiktoken_model_name,\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "default_headers": {
+ "type": "dict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "default_headers",
+ "display_name": "Default Headers",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "default_query": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "default_query",
+ "display_name": "Default Query",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "deployment": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "text-embedding-ada-002",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "deployment",
+ "display_name": "Deployment",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "disallowed_special": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": ["all"],
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "disallowed_special",
+ "display_name": "Disallowed Special",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "embedding_ctx_length": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 8191,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "embedding_ctx_length",
+ "display_name": "Embedding Context Length",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_retries": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 6,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_retries",
+ "display_name": "Max Retries",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "text-embedding-ada-002",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "text-embedding-3-small",
+ "text-embedding-3-large",
+ "text-embedding-ada-002"
+ ],
+ "name": "model",
+ "display_name": "Model",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": ""
+ },
+ "openai_api_type": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_type",
+ "display_name": "OpenAI API Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_version": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_version",
+ "display_name": "OpenAI API Version",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_organization": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_organization",
+ "display_name": "OpenAI Organization",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_proxy": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_proxy",
+ "display_name": "OpenAI Proxy",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "request_timeout": {
+ "type": "float",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "request_timeout",
+ "display_name": "Request Timeout",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
+ },
+ "load_from_db": false,
+ "title_case": false
+ },
+ "show_progress_bar": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "show_progress_bar",
+ "display_name": "Show Progress Bar",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "skip_empty": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "skip_empty",
+ "display_name": "Skip Empty",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "tiktoken_enable": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "tiktoken_enable",
+ "display_name": "TikToken Enable",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "tiktoken_model_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "tiktoken_model_name",
+ "display_name": "TikToken Model Name",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Generate embeddings using OpenAI models.",
+ "base_classes": ["Embeddings"],
+ "display_name": "OpenAI Embeddings",
+ "documentation": "",
+ "custom_fields": {
+ "openai_api_key": null,
+ "default_headers": null,
+ "default_query": null,
+ "allowed_special": null,
+ "disallowed_special": null,
+ "chunk_size": null,
+ "client": null,
+ "deployment": null,
+ "embedding_ctx_length": null,
+ "max_retries": null,
+ "model": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "openai_api_type": null,
+ "openai_api_version": null,
+ "openai_organization": null,
+ "openai_proxy": null,
+ "request_timeout": null,
+ "show_progress_bar": null,
+ "skip_empty": null,
+ "tiktoken_enable": null,
+ "tiktoken_model_name": null
+ },
+ "output_types": ["Embeddings"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "OpenAIEmbeddings-9TPjc"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 383,
+ "positionAbsolute": {
+ "x": 2814.0402191223047,
+ "y": 1955.9268168273086
+ },
+ "dragging": false
+ }
+ ],
+ "edges": [
+ {
+ "source": "TextOutput-BDknO",
+ "target": "Prompt-xeI6K",
+ "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153TextOutput\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153}",
+ "targetHandle": "{\u0153fieldName\u0153:\u0153context\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
+ "id": "reactflow__edge-TextOutput-BDknO{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153TextOutput\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153}-Prompt-xeI6K{\u0153fieldName\u0153:\u0153context\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "context",
+ "id": "Prompt-xeI6K",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "Text", "str"],
+ "dataType": "TextOutput",
+ "id": "TextOutput-BDknO"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "selected": false
+ },
+ {
+ "source": "ChatInput-yxMKE",
+ "target": "Prompt-xeI6K",
+ "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}",
+ "targetHandle": "{\u0153fieldName\u0153:\u0153question\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
+ "id": "reactflow__edge-ChatInput-yxMKE{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}-Prompt-xeI6K{\u0153fieldName\u0153:\u0153question\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "question",
+ "id": "Prompt-xeI6K",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Text", "str", "object", "Record"],
+ "dataType": "ChatInput",
+ "id": "ChatInput-yxMKE"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "selected": false
+ },
+ {
+ "source": "Prompt-xeI6K",
+ "target": "OpenAIModel-EjXlN",
+ "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153}",
+ "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
+ "id": "reactflow__edge-Prompt-xeI6K{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153}-OpenAIModel-EjXlN{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "OpenAIModel-EjXlN",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "Text", "str"],
+ "dataType": "Prompt",
+ "id": "Prompt-xeI6K"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "selected": false
+ },
+ {
+ "source": "OpenAIModel-EjXlN",
+ "target": "ChatOutput-Q39I8",
+ "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153}",
+ "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-Q39I8\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
+ "id": "reactflow__edge-OpenAIModel-EjXlN{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153}-ChatOutput-Q39I8{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-Q39I8\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "ChatOutput-Q39I8",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "Text", "str"],
+ "dataType": "OpenAIModel",
+ "id": "OpenAIModel-EjXlN"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "selected": false
+ },
+ {
+ "source": "File-t0a6a",
+ "target": "RecursiveCharacterTextSplitter-tR9QM",
+ "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153File\u0153,\u0153id\u0153:\u0153File-t0a6a\u0153}",
+ "targetHandle": "{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153Record\u0153],\u0153type\u0153:\u0153Document\u0153}",
+ "id": "reactflow__edge-File-t0a6a{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153File\u0153,\u0153id\u0153:\u0153File-t0a6a\u0153}-RecursiveCharacterTextSplitter-tR9QM{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153Record\u0153],\u0153type\u0153:\u0153Document\u0153}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "inputs",
+ "id": "RecursiveCharacterTextSplitter-tR9QM",
+ "inputTypes": ["Document", "Record"],
+ "type": "Document"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Record"],
+ "dataType": "File",
+ "id": "File-t0a6a"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "selected": false
+ },
+ {
+ "source": "OpenAIEmbeddings-ZlOk1",
+ "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-ZlOk1\u0153}",
+ "target": "AstraDBSearch-41nRz",
+ "targetHandle": "{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "embedding",
+ "id": "AstraDBSearch-41nRz",
+ "inputTypes": null,
+ "type": "Embeddings"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Embeddings"],
+ "dataType": "OpenAIEmbeddings",
+ "id": "OpenAIEmbeddings-ZlOk1"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-OpenAIEmbeddings-ZlOk1{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-ZlOk1\u0153}-AstraDBSearch-41nRz{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}"
+ },
+ {
+ "source": "ChatInput-yxMKE",
+ "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}",
+ "target": "AstraDBSearch-41nRz",
+ "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "AstraDBSearch-41nRz",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Text", "str", "object", "Record"],
+ "dataType": "ChatInput",
+ "id": "ChatInput-yxMKE"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-ChatInput-yxMKE{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}-AstraDBSearch-41nRz{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ },
+ {
+ "source": "RecursiveCharacterTextSplitter-tR9QM",
+ "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153RecursiveCharacterTextSplitter\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153}",
+ "target": "AstraDB-eUCSS",
+ "targetHandle": "{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Record\u0153}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "inputs",
+ "id": "AstraDB-eUCSS",
+ "inputTypes": null,
+ "type": "Record"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Record"],
+ "dataType": "RecursiveCharacterTextSplitter",
+ "id": "RecursiveCharacterTextSplitter-tR9QM"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-RecursiveCharacterTextSplitter-tR9QM{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153RecursiveCharacterTextSplitter\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153}-AstraDB-eUCSS{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Record\u0153}",
+ "selected": false
+ },
+ {
+ "source": "OpenAIEmbeddings-9TPjc",
+ "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-9TPjc\u0153}",
+ "target": "AstraDB-eUCSS",
+ "targetHandle": "{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "embedding",
+ "id": "AstraDB-eUCSS",
+ "inputTypes": null,
+ "type": "Embeddings"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Embeddings"],
+ "dataType": "OpenAIEmbeddings",
+ "id": "OpenAIEmbeddings-9TPjc"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-OpenAIEmbeddings-9TPjc{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-9TPjc\u0153}-AstraDB-eUCSS{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}",
+ "selected": false
+ },
+ {
+ "source": "AstraDBSearch-41nRz",
+ "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153AstraDBSearch\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153}",
+ "target": "TextOutput-BDknO",
+ "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "TextOutput-BDknO",
+ "inputTypes": ["Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Record"],
+ "dataType": "AstraDBSearch",
+ "id": "AstraDBSearch-41nRz"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-AstraDBSearch-41nRz{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153AstraDBSearch\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153}-TextOutput-BDknO{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ }
+ ],
+ "viewport": {
+ "x": -259.6782520315529,
+ "y": 90.3428735006047,
+ "zoom": 0.2687057134854984
+ }
+ },
+ "description": "Visit https://pre-release.langflow.org/tutorials/rag-with-astradb for a detailed guide of this project.\nThis project give you both Ingestion and RAG in a single file. You'll need to visit https://astra.datastax.com/ to create an Astra DB instance, your Token and grab an API Endpoint.\nRunning this project requires you to add a file in the Files component, then define a Collection Name and click on the Play icon on the Astra DB component. \n\nAfter the ingestion ends you are ready to click on the Run button at the lower left corner and start asking questions about your data.",
+ "name": "Vector Store RAG",
+ "last_tested_version": "1.0.0a0",
+ "is_component": false
}
diff --git a/docs/static/json_files/Notion_Components_bundle.json b/docs/static/json_files/Notion_Components_bundle.json
index 21181187c..5e632ad9c 100644
--- a/docs/static/json_files/Notion_Components_bundle.json
+++ b/docs/static/json_files/Notion_Components_bundle.json
@@ -1 +1,881 @@
-{"id":"7cd51434-9767-450f-8742-27857367f8c2","data":{"nodes":[{"id":"RecordsToText-Q69g5","type":"genericNode","position":{"x":-2671.5528488127866,"y":-963.4266471378126},"data":{"type":"RecordsToText","node":{"template":{"code":{"type":"code","required":true,"placeholder":"","list":false,"show":true,"multiline":true,"value":"import requests\r\nfrom typing import List\r\n\r\nfrom langflow import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\n\r\nclass NotionUserList(CustomComponent):\r\n display_name = \"List Users [Notion]\"\r\n description = \"Retrieve users from Notion.\"\r\n documentation: str = \"https://docs.langflow.org/integrations/notion/list-users\"\r\n icon = \"NotionDirectoryLoader\"\r\n \r\n def build_config(self):\r\n return {\r\n \"notion_secret\": {\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n }\r\n\r\n def build(\r\n self,\r\n notion_secret: str,\r\n ) -> List[Record]:\r\n url = \"https://api.notion.com/v1/users\"\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Notion-Version\": \"2022-06-28\",\r\n }\r\n\r\n response = requests.get(url, headers=headers)\r\n response.raise_for_status()\r\n\r\n data = response.json()\r\n results = data['results']\r\n\r\n records = []\r\n for user in results:\r\n id = user['id']\r\n type = user['type']\r\n name = user.get('name', '')\r\n avatar_url = user.get('avatar_url', '')\r\n\r\n record_data = {\r\n \"id\": id,\r\n \"type\": type,\r\n \"name\": name,\r\n \"avatar_url\": avatar_url,\r\n }\r\n\r\n output = \"User:\\n\"\r\n for key, value in record_data.items():\r\n output += f\"{key.replace('_', ' ').title()}: {value}\\n\"\r\n output += \"________________________\\n\"\r\n\r\n record = Record(text=output, data=record_data)\r\n records.append(record)\r\n\r\n self.status = \"\\n\".join(record.text for record in records)\r\n return 
records","fileTypes":[],"file_path":"","password":false,"name":"code","advanced":true,"dynamic":true,"info":"","load_from_db":false,"title_case":false},"notion_secret":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":true,"name":"notion_secret","display_name":"Notion Secret","advanced":false,"dynamic":false,"info":"The Notion integration token.","load_from_db":false,"title_case":false,"input_types":["Text"],"value":""},"_type":"CustomComponent"},"description":"Retrieve users from Notion.","icon":"NotionDirectoryLoader","base_classes":["Record"],"display_name":"List Users [Notion] ","documentation":"https://docs.langflow.org/integrations/notion/list-users","custom_fields":{"notion_secret":null},"output_types":["Record"],"field_formatters":{},"frozen":false,"field_order":[],"beta":false},"id":"RecordsToText-Q69g5","description":"Retrieve users from Notion.","display_name":"List Users [Notion] "},"selected":false,"width":384,"height":289,"dragging":false,"positionAbsolute":{"x":-2671.5528488127866,"y":-963.4266471378126}},{"id":"CustomComponent-PU0K5","type":"genericNode","position":{"x":-3077.2269116193215,"y":-960.9450220159636},"data":{"type":"CustomComponent","node":{"template":{"code":{"type":"code","required":true,"placeholder":"","list":false,"show":true,"multiline":true,"value":"import json\r\nfrom typing import Optional\r\n\r\nimport requests\r\nfrom langflow.custom import CustomComponent\r\n\r\n\r\nclass NotionPageCreator(CustomComponent):\r\n display_name = \"Create Page [Notion]\"\r\n description = \"A component for creating Notion pages.\"\r\n documentation: str = \"https://docs.langflow.org/integrations/notion/page-create\"\r\n icon = \"NotionDirectoryLoader\"\r\n\r\n def build_config(self):\r\n return {\r\n \"database_id\": {\r\n \"display_name\": \"Database ID\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The ID of the Notion database.\",\r\n },\r\n \"notion_secret\": 
{\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n \"properties\": {\r\n \"display_name\": \"Properties\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The properties of the new page. Depending on your database setup, this can change. E.G: {'Task name': {'id': 'title', 'type': 'title', 'title': [{'type': 'text', 'text': {'content': 'Send Notion Components to LF', 'link': null}}]}}\",\r\n },\r\n }\r\n\r\n def build(\r\n self,\r\n database_id: str,\r\n notion_secret: str,\r\n properties: str = '{\"Task name\": {\"id\": \"title\", \"type\": \"title\", \"title\": [{\"type\": \"text\", \"text\": {\"content\": \"Send Notion Components to LF\", \"link\": null}}]}}',\r\n ) -> str:\r\n if not database_id or not properties:\r\n raise ValueError(\"Invalid input. Please provide 'database_id' and 'properties'.\")\r\n\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Content-Type\": \"application/json\",\r\n \"Notion-Version\": \"2022-06-28\",\r\n }\r\n\r\n data = {\r\n \"parent\": {\"database_id\": database_id},\r\n \"properties\": json.loads(properties),\r\n }\r\n\r\n response = requests.post(\"https://api.notion.com/v1/pages\", headers=headers, json=data)\r\n\r\n if response.status_code == 200:\r\n page_id = response.json()[\"id\"]\r\n self.status = f\"Successfully created Notion page with ID: {page_id}\\n {str(response.json())}\"\r\n return response.json()\r\n else:\r\n error_message = f\"Failed to create Notion page. 
Status code: {response.status_code}, Error: {response.text}\"\r\n self.status = error_message\r\n raise Exception(error_message)","fileTypes":[],"file_path":"","password":false,"name":"code","advanced":true,"dynamic":true,"info":"","load_from_db":false,"title_case":false},"database_id":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":false,"name":"database_id","display_name":"Database ID","advanced":false,"dynamic":false,"info":"The ID of the Notion database.","load_from_db":false,"title_case":false,"input_types":["Text"]},"notion_secret":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":true,"name":"notion_secret","display_name":"Notion Secret","advanced":false,"dynamic":false,"info":"The Notion integration token.","load_from_db":false,"title_case":false,"input_types":["Text"],"value":""},"properties":{"type":"str","required":false,"placeholder":"","list":false,"show":true,"multiline":false,"value":"{\"Task name\": {\"id\": \"title\", \"type\": \"title\", \"title\": [{\"type\": \"text\", \"text\": {\"content\": \"Send Notion Components to LF\", \"link\": null}}]}}","fileTypes":[],"file_path":"","password":false,"name":"properties","display_name":"Properties","advanced":false,"dynamic":false,"info":"The properties of the new page. Depending on your database setup, this can change. 
E.G: {'Task name': {'id': 'title', 'type': 'title', 'title': [{'type': 'text', 'text': {'content': 'Send Notion Components to LF', 'link': null}}]}}","load_from_db":false,"title_case":false,"input_types":["Text"]},"_type":"CustomComponent"},"description":"A component for creating Notion pages.","icon":"NotionDirectoryLoader","base_classes":["object","str","Text"],"display_name":"Create Page [Notion] ","documentation":"https://docs.langflow.org/integrations/notion/page-create","custom_fields":{"database_id":null,"notion_secret":null,"properties":null},"output_types":["Text"],"field_formatters":{},"frozen":false,"field_order":[],"beta":false},"id":"CustomComponent-PU0K5","description":"A component for creating Notion pages.","display_name":"Create Page [Notion] "},"selected":false,"width":384,"height":477,"positionAbsolute":{"x":-3077.2269116193215,"y":-960.9450220159636},"dragging":false},{"id":"CustomComponent-YODla","type":"genericNode","position":{"x":-3485.297183150799,"y":-362.8525892356713},"data":{"type":"CustomComponent","node":{"template":{"code":{"type":"code","required":true,"placeholder":"","list":false,"show":true,"multiline":true,"value":"import requests\r\nfrom typing import Dict\r\n\r\nfrom langflow import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\n\r\nclass NotionDatabaseProperties(CustomComponent):\r\n display_name = \"List Database Properties [Notion]\"\r\n description = \"Retrieve properties of a Notion database.\"\r\n documentation: str = \"https://docs.langflow.org/integrations/notion/list-database-properties\"\r\n icon = \"NotionDirectoryLoader\"\r\n \r\n def build_config(self):\r\n return {\r\n \"database_id\": {\r\n \"display_name\": \"Database ID\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The ID of the Notion database.\",\r\n },\r\n \"notion_secret\": {\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n }\r\n\r\n def 
build(\r\n self,\r\n database_id: str,\r\n notion_secret: str,\r\n ) -> Record:\r\n url = f\"https://api.notion.com/v1/databases/{database_id}\"\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Notion-Version\": \"2022-06-28\", # Use the latest supported version\r\n }\r\n\r\n response = requests.get(url, headers=headers)\r\n response.raise_for_status()\r\n\r\n data = response.json()\r\n properties = data.get(\"properties\", {})\r\n\r\n record = Record(text=str(response.json()), data=properties)\r\n self.status = f\"Retrieved {len(properties)} properties from the Notion database.\\n {record.text}\"\r\n return record","fileTypes":[],"file_path":"","password":false,"name":"code","advanced":true,"dynamic":true,"info":"","load_from_db":false,"title_case":false},"database_id":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":false,"name":"database_id","display_name":"Database ID","advanced":false,"dynamic":false,"info":"The ID of the Notion database.","load_from_db":true,"title_case":false,"input_types":["Text"],"value":"NOTION_NMSTX_DB_ID"},"notion_secret":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":true,"name":"notion_secret","display_name":"Notion Secret","advanced":false,"dynamic":false,"info":"The Notion integration token.","load_from_db":true,"title_case":false,"input_types":["Text"],"value":""},"_type":"CustomComponent"},"description":"Retrieve properties of a Notion database.","icon":"NotionDirectoryLoader","base_classes":["Record"],"display_name":"List Database Properties [Notion] ","documentation":"https://docs.langflow.org/integrations/notion/list-database-properties","custom_fields":{"database_id":null,"notion_secret":null},"output_types":["Record"],"field_formatters":{},"frozen":false,"field_order":[],"beta":false},"id":"CustomComponent-YODla","description":"Retrieve 
properties of a Notion database.","display_name":"List Database Properties [Notion] "},"selected":true,"width":384,"height":383,"dragging":false,"positionAbsolute":{"x":-3485.297183150799,"y":-362.8525892356713}},{"id":"CustomComponent-wHlSz","type":"genericNode","position":{"x":-2668.7714642455403,"y":-657.2376228212606},"data":{"type":"CustomComponent","node":{"template":{"code":{"type":"code","required":true,"placeholder":"","list":false,"show":true,"multiline":true,"value":"import json\r\nimport requests\r\nfrom typing import Dict, Any\r\n\r\nfrom langflow import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\n\r\nclass NotionPageUpdate(CustomComponent):\r\n display_name = \"Update Page Property [Notion]\"\r\n description = \"Update the properties of a Notion page.\"\r\n documentation: str = \"https://docs.langflow.org/integrations/notion/page-update\"\r\n icon = \"NotionDirectoryLoader\"\r\n\r\n def build_config(self):\r\n return {\r\n \"page_id\": {\r\n \"display_name\": \"Page ID\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The ID of the Notion page to update.\",\r\n },\r\n \"properties\": {\r\n \"display_name\": \"Properties\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The properties to update on the page (as a JSON string).\",\r\n \"multiline\": True,\r\n },\r\n \"notion_secret\": {\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n }\r\n\r\n def build(\r\n self,\r\n page_id: str,\r\n properties: str,\r\n notion_secret: str,\r\n ) -> Record:\r\n url = f\"https://api.notion.com/v1/pages/{page_id}\"\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Content-Type\": \"application/json\",\r\n \"Notion-Version\": \"2022-06-28\", # Use the latest supported version\r\n }\r\n\r\n try:\r\n parsed_properties = json.loads(properties)\r\n except json.JSONDecodeError as e:\r\n raise ValueError(\"Invalid JSON format for 
properties\") from e\r\n\r\n data = {\r\n \"properties\": parsed_properties\r\n }\r\n\r\n response = requests.patch(url, headers=headers, json=data)\r\n response.raise_for_status()\r\n\r\n updated_page = response.json()\r\n\r\n output = \"Updated page properties:\\n\"\r\n for prop_name, prop_value in updated_page[\"properties\"].items():\r\n output += f\"{prop_name}: {prop_value}\\n\"\r\n\r\n self.status = output\r\n return Record(data=updated_page)","fileTypes":[],"file_path":"","password":false,"name":"code","advanced":true,"dynamic":true,"info":"","load_from_db":false,"title_case":false},"notion_secret":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":true,"name":"notion_secret","display_name":"Notion Secret","advanced":false,"dynamic":false,"info":"The Notion integration token.","load_from_db":true,"title_case":false,"input_types":["Text"],"value":""},"page_id":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":false,"name":"page_id","display_name":"Page ID","advanced":false,"dynamic":false,"info":"The ID of the Notion page to update.","load_from_db":false,"title_case":false,"input_types":["Text"]},"properties":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":true,"fileTypes":[],"file_path":"","password":false,"name":"properties","display_name":"Properties","advanced":false,"dynamic":false,"info":"The properties to update on the page (as a JSON string).","load_from_db":false,"title_case":false,"input_types":["Text"],"value":"{ \"title\": [ { \"text\": { \"content\": \"Test Page\" } } ] }"},"_type":"CustomComponent"},"description":"Update the properties of a Notion page.","icon":"NotionDirectoryLoader","base_classes":["Record"],"display_name":"Update Page Property 
[Notion]","documentation":"https://docs.langflow.org/integrations/notion/page-update","custom_fields":{"page_id":null,"properties":null,"notion_secret":null},"output_types":["Record"],"field_formatters":{},"frozen":false,"field_order":[],"beta":false},"id":"CustomComponent-wHlSz","description":"Update the properties of a Notion page.","display_name":"Update Page Property [Notion]"},"selected":false,"width":384,"height":477,"dragging":false,"positionAbsolute":{"x":-2668.7714642455403,"y":-657.2376228212606}},{"id":"CustomComponent-oelYw","type":"genericNode","position":{"x":-2253.1007124701327,"y":-448.47240118604134},"data":{"type":"CustomComponent","node":{"template":{"code":{"type":"code","required":true,"placeholder":"","list":false,"show":true,"multiline":true,"value":"import requests\r\nfrom typing import Dict, Any\r\n\r\nfrom langflow import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\n\r\nclass NotionPageContent(CustomComponent):\r\n display_name = \"Page Content Viewer [Notion]\"\r\n description = \"Retrieve the content of a Notion page as plain text.\"\r\n documentation: str = \"https://docs.langflow.org/integrations/notion/page-content-viewer\"\r\n icon = \"NotionDirectoryLoader\"\r\n\r\n def build_config(self):\r\n return {\r\n \"page_id\": {\r\n \"display_name\": \"Page ID\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The ID of the Notion page to retrieve.\",\r\n },\r\n \"notion_secret\": {\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n }\r\n\r\n def build(\r\n self,\r\n page_id: str,\r\n notion_secret: str,\r\n ) -> Record:\r\n blocks_url = f\"https://api.notion.com/v1/blocks/{page_id}/children?page_size=100\"\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Notion-Version\": \"2022-06-28\", # Use the latest supported version\r\n }\r\n\r\n # Retrieve the child blocks\r\n blocks_response = 
requests.get(blocks_url, headers=headers)\r\n blocks_response.raise_for_status()\r\n blocks_data = blocks_response.json()\r\n\r\n # Parse the blocks and extract the content as plain text\r\n content = self.parse_blocks(blocks_data[\"results\"])\r\n\r\n self.status = content\r\n return Record(data={\"content\": content}, text=content)\r\n\r\n def parse_blocks(self, blocks: list) -> str:\r\n content = \"\"\r\n for block in blocks:\r\n block_type = block[\"type\"]\r\n if block_type in [\"paragraph\", \"heading_1\", \"heading_2\", \"heading_3\", \"quote\"]:\r\n content += self.parse_rich_text(block[block_type][\"rich_text\"]) + \"\\n\\n\"\r\n elif block_type in [\"bulleted_list_item\", \"numbered_list_item\"]:\r\n content += self.parse_rich_text(block[block_type][\"rich_text\"]) + \"\\n\"\r\n elif block_type == \"to_do\":\r\n content += self.parse_rich_text(block[\"to_do\"][\"rich_text\"]) + \"\\n\"\r\n elif block_type == \"code\":\r\n content += self.parse_rich_text(block[\"code\"][\"rich_text\"]) + \"\\n\\n\"\r\n elif block_type == \"image\":\r\n content += f\"[Image: {block['image']['external']['url']}]\\n\\n\"\r\n elif block_type == \"divider\":\r\n content += \"---\\n\\n\"\r\n return content.strip()\r\n\r\n def parse_rich_text(self, rich_text: list) -> str:\r\n text = \"\"\r\n for segment in rich_text:\r\n text += segment[\"plain_text\"]\r\n return text","fileTypes":[],"file_path":"","password":false,"name":"code","advanced":true,"dynamic":true,"info":"","load_from_db":false,"title_case":false},"notion_secret":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":true,"name":"notion_secret","display_name":"Notion Secret","advanced":false,"dynamic":false,"info":"The Notion integration 
token.","load_from_db":true,"title_case":false,"input_types":["Text"],"value":""},"page_id":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":false,"name":"page_id","display_name":"Page ID","advanced":false,"dynamic":false,"info":"The ID of the Notion page to retrieve.","load_from_db":false,"title_case":false,"input_types":["Text"]},"_type":"CustomComponent"},"description":"Retrieve the content of a Notion page as plain text.","icon":"NotionDirectoryLoader","base_classes":["Record"],"display_name":"Page Content Viewer [Notion] ","documentation":"https://docs.langflow.org/integrations/notion/page-content-viewer","custom_fields":{"page_id":null,"notion_secret":null},"output_types":["Record"],"field_formatters":{},"frozen":false,"field_order":[],"beta":false},"id":"CustomComponent-oelYw","description":"Retrieve the content of a Notion page as plain text.","display_name":"Page Content Viewer [Notion] "},"selected":false,"width":384,"height":383,"positionAbsolute":{"x":-2253.1007124701327,"y":-448.47240118604134},"dragging":false},{"id":"CustomComponent-Pn52w","type":"genericNode","position":{"x":-3070.9222948695096,"y":-472.4537855763852},"data":{"type":"CustomComponent","node":{"template":{"code":{"type":"code","required":true,"placeholder":"","list":false,"show":true,"multiline":true,"value":"import requests\r\nimport json\r\nfrom typing import Dict, Any, List\r\nfrom langflow.custom import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\nclass NotionListPages(CustomComponent):\r\n display_name = \"List Pages [Notion]\"\r\n description = (\r\n \"Query a Notion database with filtering and sorting. \"\r\n \"The input should be a JSON string containing the 'filter' and 'sorts' objects. 
\"\r\n \"Example input:\\n\"\r\n '{\"filter\": {\"property\": \"Status\", \"select\": {\"equals\": \"Done\"}}, \"sorts\": [{\"timestamp\": \"created_time\", \"direction\": \"descending\"}]}'\r\n )\r\n documentation: str = \"https://docs.langflow.org/integrations/notion/list-pages\"\r\n icon = \"NotionDirectoryLoader\"\r\n\r\n field_order = [\r\n \"notion_secret\",\r\n \"database_id\",\r\n \"query_payload\",\r\n ]\r\n\r\n def build_config(self):\r\n return {\r\n \"notion_secret\": {\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n \"database_id\": {\r\n \"display_name\": \"Database ID\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The ID of the Notion database to query.\",\r\n },\r\n \"query_payload\": {\r\n \"display_name\": \"Database query\",\r\n \"field_type\": \"str\",\r\n \"info\": \"A JSON string containing the filters that will be used for querying the database. EG: {'filter': {'property': 'Status', 'status': {'equals': 'In progress'}}}\",\r\n },\r\n }\r\n\r\n def build(\r\n self,\r\n notion_secret: str,\r\n database_id: str,\r\n query_payload: str = \"{}\",\r\n ) -> List[Record]:\r\n try:\r\n query_data = json.loads(query_payload)\r\n filter_obj = query_data.get(\"filter\")\r\n sorts = query_data.get(\"sorts\", [])\r\n\r\n url = f\"https://api.notion.com/v1/databases/{database_id}/query\"\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Content-Type\": \"application/json\",\r\n \"Notion-Version\": \"2022-06-28\",\r\n }\r\n\r\n data = {\r\n \"sorts\": sorts,\r\n }\r\n\r\n if filter_obj:\r\n data[\"filter\"] = filter_obj\r\n\r\n response = requests.post(url, headers=headers, json=data)\r\n response.raise_for_status()\r\n\r\n results = response.json()\r\n records = []\r\n combined_text = f\"Pages found: {len(results['results'])}\\n\\n\"\r\n for page in results['results']:\r\n page_data = {\r\n 'id': page['id'],\r\n 'url': 
page['url'],\r\n 'created_time': page['created_time'],\r\n 'last_edited_time': page['last_edited_time'],\r\n 'properties': page['properties'],\r\n }\r\n\r\n text = (\r\n f\"id: {page['id']}\\n\"\r\n f\"url: {page['url']}\\n\"\r\n f\"created_time: {page['created_time']}\\n\"\r\n f\"last_edited_time: {page['last_edited_time']}\\n\"\r\n f\"properties: {json.dumps(page['properties'], indent=2)}\\n\\n\"\r\n )\r\n\r\n combined_text += text\r\n records.append(Record(text=text, data=page_data))\r\n \r\n self.status = combined_text.strip()\r\n return records\r\n\r\n except Exception as e:\r\n self.status = f\"An error occurred: {str(e)}\"\r\n return [Record(text=self.status, data=[])]","fileTypes":[],"file_path":"","password":false,"name":"code","advanced":true,"dynamic":true,"info":"","load_from_db":false,"title_case":false},"database_id":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":false,"name":"database_id","display_name":"Database ID","advanced":false,"dynamic":false,"info":"The ID of the Notion database to query.","load_from_db":true,"title_case":false,"input_types":["Text"],"value":"NOTION_NMSTX_DB_ID"},"notion_secret":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":true,"name":"notion_secret","display_name":"Notion Secret","advanced":false,"dynamic":false,"info":"The Notion integration token.","load_from_db":true,"title_case":false,"input_types":["Text"],"value":""},"query_payload":{"type":"str","required":false,"placeholder":"","list":false,"show":true,"multiline":false,"value":{},"fileTypes":[],"file_path":"","password":false,"name":"query_payload","display_name":"Database query","advanced":false,"dynamic":false,"info":"A JSON string containing the filters that will be used for querying the database. 
EG: {'filter': {'property': 'Status', 'status': {'equals': 'In progress'}}}","load_from_db":false,"title_case":false,"input_types":["Text"]},"_type":"CustomComponent"},"description":"Query a Notion database with filtering and sorting. The input should be a JSON string containing the 'filter' and 'sorts' objects. Example input:\n{\"filter\": {\"property\": \"Status\", \"select\": {\"equals\": \"Done\"}}, \"sorts\": [{\"timestamp\": \"created_time\", \"direction\": \"descending\"}]}","icon":"NotionDirectoryLoader","base_classes":["Record"],"display_name":"List Pages [Notion] ","documentation":"https://docs.langflow.org/integrations/notion/list-pages","custom_fields":{"notion_secret":null,"database_id":null,"query_payload":null},"output_types":["Record"],"field_formatters":{},"frozen":false,"field_order":["notion_secret","database_id","query_payload"],"beta":false},"id":"CustomComponent-Pn52w","description":"Query a Notion database with filtering and sorting. The input should be a JSON string containing the 'filter' and 'sorts' objects. 
Example input:\n{\"filter\": {\"property\": \"Status\", \"select\": {\"equals\": \"Done\"}}, \"sorts\": [{\"timestamp\": \"created_time\", \"direction\": \"descending\"}]}","display_name":"List Pages [Notion] "},"selected":false,"width":384,"height":517,"positionAbsolute":{"x":-3070.9222948695096,"y":-472.4537855763852},"dragging":false},{"id":"CustomComponent-I8Dec","type":"genericNode","position":{"x":-2256.686402636563,"y":-963.4541117792749},"data":{"type":"CustomComponent","node":{"template":{"block_id":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":false,"name":"block_id","display_name":"Page/Block ID","advanced":false,"dynamic":false,"info":"The ID of the page/block to add the content.","load_from_db":false,"title_case":false,"input_types":["Text"]},"code":{"type":"code","required":true,"placeholder":"","list":false,"show":true,"multiline":true,"value":"import json\r\nfrom typing import List, Dict, Any\r\nfrom markdown import markdown\r\nfrom bs4 import BeautifulSoup\r\nimport requests\r\n\r\nfrom langflow import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\nclass AddContentToPage(CustomComponent):\r\n display_name = \"Add Content to Page [Notion]\"\r\n description = \"Convert markdown text to Notion blocks and append them to a Notion page.\"\r\n documentation: str = \"https://developers.notion.com/reference/patch-block-children\"\r\n icon = \"NotionDirectoryLoader\"\r\n\r\n def build_config(self):\r\n return {\r\n \"markdown_text\": {\r\n \"display_name\": \"Markdown Text\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The markdown text to convert to Notion blocks.\",\r\n \"multiline\": True,\r\n },\r\n \"block_id\": {\r\n \"display_name\": \"Page/Block ID\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The ID of the page/block to add the content.\",\r\n },\r\n \"notion_secret\": {\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n 
\"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n }\r\n\r\n def build(self, markdown_text: str, block_id: str, notion_secret: str) -> Record:\r\n html_text = markdown(markdown_text)\r\n soup = BeautifulSoup(html_text, 'html.parser')\r\n blocks = self.process_node(soup)\r\n\r\n url = f\"https://api.notion.com/v1/blocks/{block_id}/children\"\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Content-Type\": \"application/json\",\r\n \"Notion-Version\": \"2022-06-28\",\r\n }\r\n\r\n data = {\r\n \"children\": blocks,\r\n }\r\n\r\n response = requests.patch(url, headers=headers, json=data)\r\n self.status = str(response.json())\r\n response.raise_for_status()\r\n\r\n result = response.json()\r\n self.status = f\"Appended {len(blocks)} blocks to page with ID: {block_id}\"\r\n return Record(data=result, text=json.dumps(result))\r\n\r\n def process_node(self, node):\r\n blocks = []\r\n if isinstance(node, str):\r\n text = node.strip()\r\n if text:\r\n if text.startswith('#'):\r\n heading_level = text.count('#', 0, 6)\r\n heading_text = text[heading_level:].strip()\r\n if heading_level == 1:\r\n blocks.append(self.create_block('heading_1', heading_text))\r\n elif heading_level == 2:\r\n blocks.append(self.create_block('heading_2', heading_text))\r\n elif heading_level == 3:\r\n blocks.append(self.create_block('heading_3', heading_text))\r\n else:\r\n blocks.append(self.create_block('paragraph', text))\r\n elif node.name == 'h1':\r\n blocks.append(self.create_block('heading_1', node.get_text(strip=True)))\r\n elif node.name == 'h2':\r\n blocks.append(self.create_block('heading_2', node.get_text(strip=True)))\r\n elif node.name == 'h3':\r\n blocks.append(self.create_block('heading_3', node.get_text(strip=True)))\r\n elif node.name == 'p':\r\n code_node = node.find('code')\r\n if code_node:\r\n code_text = code_node.get_text()\r\n language, code = self.extract_language_and_code(code_text)\r\n 
blocks.append(self.create_block('code', code, language=language))\r\n elif self.is_table(str(node)):\r\n blocks.extend(self.process_table(node))\r\n else:\r\n blocks.append(self.create_block('paragraph', node.get_text(strip=True)))\r\n elif node.name == 'ul':\r\n blocks.extend(self.process_list(node, 'bulleted_list_item'))\r\n elif node.name == 'ol':\r\n blocks.extend(self.process_list(node, 'numbered_list_item'))\r\n elif node.name == 'blockquote':\r\n blocks.append(self.create_block('quote', node.get_text(strip=True)))\r\n elif node.name == 'hr':\r\n blocks.append(self.create_block('divider', ''))\r\n elif node.name == 'img':\r\n blocks.append(self.create_block('image', '', image_url=node.get('src')))\r\n elif node.name == 'a':\r\n blocks.append(self.create_block('bookmark', node.get_text(strip=True), link_url=node.get('href')))\r\n elif node.name == 'table':\r\n blocks.extend(self.process_table(node))\r\n\r\n for child in node.children:\r\n if isinstance(child, str):\r\n continue\r\n blocks.extend(self.process_node(child))\r\n\r\n return blocks\r\n\r\n def extract_language_and_code(self, code_text):\r\n lines = code_text.split('\\n')\r\n language = lines[0].strip()\r\n code = '\\n'.join(lines[1:]).strip()\r\n return language, code\r\n\r\n def is_code_block(self, text):\r\n return text.startswith('```')\r\n\r\n def extract_code_block(self, text):\r\n lines = text.split('\\n')\r\n language = lines[0].strip('`').strip()\r\n code = '\\n'.join(lines[1:]).strip('`').strip()\r\n return language, code\r\n \r\n def is_table(self, text):\r\n rows = text.split('\\n')\r\n if len(rows) < 2:\r\n return False\r\n\r\n has_separator = False\r\n for i, row in enumerate(rows):\r\n if '|' in row:\r\n cells = [cell.strip() for cell in row.split('|')]\r\n cells = [cell for cell in cells if cell] # Remove empty cells\r\n if i == 1 and all(set(cell) <= set('-|') for cell in cells):\r\n has_separator = True\r\n elif not cells:\r\n return False\r\n\r\n return has_separator and len(rows) 
>= 3\r\n\r\n def process_list(self, node, list_type):\r\n blocks = []\r\n for item in node.find_all('li'):\r\n item_text = item.get_text(strip=True)\r\n checked = item_text.startswith('[x]')\r\n is_checklist = item_text.startswith('[ ]') or checked\r\n\r\n if is_checklist:\r\n item_text = item_text.replace('[x]', '').replace('[ ]', '').strip()\r\n blocks.append(self.create_block('to_do', item_text, checked=checked))\r\n else:\r\n blocks.append(self.create_block(list_type, item_text))\r\n return blocks\r\n\r\n def process_table(self, node):\r\n blocks = []\r\n header_row = node.find('thead').find('tr') if node.find('thead') else None\r\n body_rows = node.find('tbody').find_all('tr') if node.find('tbody') else []\r\n\r\n if header_row or body_rows:\r\n table_width = max(len(header_row.find_all(['th', 'td'])) if header_row else 0,\r\n max(len(row.find_all(['th', 'td'])) for row in body_rows))\r\n\r\n table_block = self.create_block('table', '', table_width=table_width, has_column_header=bool(header_row))\r\n blocks.append(table_block)\r\n\r\n if header_row:\r\n header_cells = [cell.get_text(strip=True) for cell in header_row.find_all(['th', 'td'])]\r\n header_row_block = self.create_block('table_row', header_cells)\r\n blocks.append(header_row_block)\r\n\r\n for row in body_rows:\r\n cells = [cell.get_text(strip=True) for cell in row.find_all(['th', 'td'])]\r\n row_block = self.create_block('table_row', cells)\r\n blocks.append(row_block)\r\n\r\n return blocks\r\n \r\n def create_block(self, block_type: str, content: str, **kwargs) -> Dict[str, Any]:\r\n block = {\r\n \"object\": \"block\",\r\n \"type\": block_type,\r\n block_type: {},\r\n }\r\n\r\n if block_type in [\"paragraph\", \"heading_1\", \"heading_2\", \"heading_3\", \"bulleted_list_item\", \"numbered_list_item\", \"quote\"]:\r\n block[block_type][\"rich_text\"] = [\r\n {\r\n \"type\": \"text\",\r\n \"text\": {\r\n \"content\": content,\r\n },\r\n }\r\n ]\r\n elif block_type == 'to_do':\r\n 
block[block_type][\"rich_text\"] = [\r\n {\r\n \"type\": \"text\",\r\n \"text\": {\r\n \"content\": content,\r\n },\r\n }\r\n ]\r\n block[block_type]['checked'] = kwargs.get('checked', False)\r\n elif block_type == 'code':\r\n block[block_type]['rich_text'] = [\r\n {\r\n \"type\": \"text\",\r\n \"text\": {\r\n \"content\": content,\r\n },\r\n }\r\n ]\r\n block[block_type]['language'] = kwargs.get('language', 'plain text')\r\n elif block_type == 'image':\r\n block[block_type] = {\r\n \"type\": \"external\",\r\n \"external\": {\r\n \"url\": kwargs.get('image_url', '')\r\n }\r\n }\r\n elif block_type == 'divider':\r\n pass\r\n elif block_type == 'bookmark':\r\n block[block_type]['url'] = kwargs.get('link_url', '')\r\n elif block_type == 'table':\r\n block[block_type]['table_width'] = kwargs.get('table_width', 0)\r\n block[block_type]['has_column_header'] = kwargs.get('has_column_header', False)\r\n block[block_type]['has_row_header'] = kwargs.get('has_row_header', False)\r\n elif block_type == 'table_row':\r\n block[block_type]['cells'] = [[{'type': 'text', 'text': {'content': cell}} for cell in content]]\r\n\r\n return block","fileTypes":[],"file_path":"","password":false,"name":"code","advanced":true,"dynamic":true,"info":"","load_from_db":false,"title_case":false},"markdown_text":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":true,"fileTypes":[],"file_path":"","password":false,"name":"markdown_text","display_name":"Markdown Text","advanced":false,"dynamic":false,"info":"The markdown text to convert to Notion blocks.","load_from_db":false,"title_case":false,"input_types":["Text"],"value":"# Heading 1\n\n## Heading 2\n\n### Heading 3\n\nThis is a regular paragraph.\n\nHere's another paragraph with an image:\n\n\n## Checklist\n- [x] Completed task\n- [ ] Incomplete task\n- [x] Another completed task\n\n## Numbered List\n1. First item\n2. Second item\n3. 
Third item\n\n## Bulleted List\n- Item 1\n- Item 2\n- Item 3\n\n## Code Block\n```python\ndef hello_world():\n print(\"Hello, World!\")\n```\n\n## Quote\n> This is a blockquote.\n> It can span multiple lines.\n\n## Horizontal Rule\n---\n\n\n## Link\n[Notion API Documentation](https://developers.notion.com)\n\n"},"notion_secret":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":true,"name":"notion_secret","display_name":"Notion Secret","advanced":false,"dynamic":false,"info":"The Notion integration token.","load_from_db":true,"title_case":false,"input_types":["Text"],"value":""},"_type":"CustomComponent"},"description":"Convert markdown text to Notion blocks and append them to a Notion page.","icon":"NotionDirectoryLoader","base_classes":["Record"],"display_name":"Add Content to Page [Notion] ","documentation":"https://developers.notion.com/reference/patch-block-children","custom_fields":{"markdown_text":null,"block_id":null,"notion_secret":null},"output_types":["Record"],"field_formatters":{},"frozen":false,"field_order":[],"beta":false,"official":false},"id":"CustomComponent-I8Dec"},"selected":false,"width":384,"height":497,"positionAbsolute":{"x":-2256.686402636563,"y":-963.4541117792749},"dragging":false},{"id":"CustomComponent-ZcsA9","type":"genericNode","position":{"x":-3488.029350341937,"y":-965.3756250644985},"data":{"type":"CustomComponent","node":{"template":{"code":{"type":"code","required":true,"placeholder":"","list":false,"show":true,"multiline":true,"value":"import requests\r\nfrom typing import Dict, Any, List\r\nfrom langflow.custom import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\nclass NotionSearch(CustomComponent):\r\n display_name = \"Search Notion\"\r\n description = (\r\n \"Searches all pages and databases that have been shared with an integration.\"\r\n )\r\n documentation: str = \"https://docs.langflow.org/integrations/notion/search\"\r\n icon 
= \"NotionDirectoryLoader\"\r\n\r\n field_order = [\r\n \"notion_secret\",\r\n \"query\",\r\n \"filter_value\",\r\n \"sort_direction\",\r\n ]\r\n\r\n def build_config(self):\r\n return {\r\n \"notion_secret\": {\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n \"query\": {\r\n \"display_name\": \"Search Query\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The text that the API compares page and database titles against.\",\r\n },\r\n \"filter_value\": {\r\n \"display_name\": \"Filter Type\",\r\n \"field_type\": \"str\",\r\n \"info\": \"Limits the results to either only pages or only databases.\",\r\n \"options\": [\"page\", \"database\"],\r\n \"default_value\": \"page\",\r\n },\r\n \"sort_direction\": {\r\n \"display_name\": \"Sort Direction\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The direction to sort the results.\",\r\n \"options\": [\"ascending\", \"descending\"],\r\n \"default_value\": \"descending\",\r\n },\r\n }\r\n\r\n def build(\r\n self,\r\n notion_secret: str,\r\n query: str = \"\",\r\n filter_value: str = \"page\",\r\n sort_direction: str = \"descending\",\r\n ) -> List[Record]:\r\n try:\r\n url = \"https://api.notion.com/v1/search\"\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Content-Type\": \"application/json\",\r\n \"Notion-Version\": \"2022-06-28\",\r\n }\r\n\r\n data = {\r\n \"query\": query,\r\n \"filter\": {\r\n \"value\": filter_value,\r\n \"property\": \"object\"\r\n },\r\n \"sort\":{\r\n \"direction\": sort_direction,\r\n \"timestamp\": \"last_edited_time\"\r\n }\r\n }\r\n\r\n response = requests.post(url, headers=headers, json=data)\r\n response.raise_for_status()\r\n\r\n results = response.json()\r\n records = []\r\n combined_text = f\"Results found: {len(results['results'])}\\n\\n\"\r\n for result in results['results']:\r\n result_data = {\r\n 'id': result['id'],\r\n 'type': result['object'],\r\n 
'last_edited_time': result['last_edited_time'],\r\n }\r\n \r\n if result['object'] == 'page':\r\n result_data['title_or_url'] = result['url']\r\n text = f\"id: {result['id']}\\ntitle_or_url: {result['url']}\\n\"\r\n elif result['object'] == 'database':\r\n if 'title' in result and isinstance(result['title'], list) and len(result['title']) > 0:\r\n result_data['title_or_url'] = result['title'][0]['plain_text']\r\n text = f\"id: {result['id']}\\ntitle_or_url: {result['title'][0]['plain_text']}\\n\"\r\n else:\r\n result_data['title_or_url'] = \"N/A\"\r\n text = f\"id: {result['id']}\\ntitle_or_url: N/A\\n\"\r\n\r\n text += f\"type: {result['object']}\\nlast_edited_time: {result['last_edited_time']}\\n\\n\"\r\n combined_text += text\r\n records.append(Record(text=text, data=result_data))\r\n \r\n self.status = combined_text\r\n return records\r\n\r\n except Exception as e:\r\n self.status = f\"An error occurred: {str(e)}\"\r\n return [Record(text=self.status, data=[])]","fileTypes":[],"file_path":"","password":false,"name":"code","advanced":true,"dynamic":true,"info":"","load_from_db":false,"title_case":false},"filter_value":{"type":"str","required":false,"placeholder":"","list":true,"show":true,"multiline":false,"value":"database","fileTypes":[],"file_path":"","password":false,"options":["page","database"],"name":"filter_value","display_name":"Filter Type","advanced":false,"dynamic":false,"info":"Limits the results to either only pages or only databases.","load_from_db":false,"title_case":false,"input_types":["Text"]},"notion_secret":{"type":"str","required":true,"placeholder":"","list":false,"show":true,"multiline":false,"fileTypes":[],"file_path":"","password":true,"name":"notion_secret","display_name":"Notion Secret","advanced":false,"dynamic":false,"info":"The Notion integration 
token.","load_from_db":true,"title_case":false,"input_types":["Text"],"value":""},"query":{"type":"str","required":false,"placeholder":"","list":false,"show":true,"multiline":false,"value":"","fileTypes":[],"file_path":"","password":false,"name":"query","display_name":"Search Query","advanced":false,"dynamic":false,"info":"The text that the API compares page and database titles against.","load_from_db":false,"title_case":false,"input_types":["Text"]},"sort_direction":{"type":"str","required":false,"placeholder":"","list":true,"show":true,"multiline":false,"value":"descending","fileTypes":[],"file_path":"","password":false,"options":["ascending","descending"],"name":"sort_direction","display_name":"Sort Direction","advanced":false,"dynamic":false,"info":"The direction to sort the results.","load_from_db":false,"title_case":false,"input_types":["Text"]},"_type":"CustomComponent"},"description":"Searches all pages and databases that have been shared with an integration.","icon":"NotionDirectoryLoader","base_classes":["Record"],"display_name":"Search [Notion]","documentation":"https://docs.langflow.org/integrations/notion/search","custom_fields":{"notion_secret":null,"query":null,"filter_value":null,"sort_direction":null},"output_types":["Record"],"field_formatters":{},"frozen":false,"field_order":["notion_secret","query","filter_value","sort_direction"],"beta":false},"id":"CustomComponent-ZcsA9","description":"Searches all pages and databases that have been shared with an integration.","display_name":"Search [Notion]"},"selected":false,"width":384,"height":591,"positionAbsolute":{"x":-3488.029350341937,"y":-965.3756250644985},"dragging":false}],"edges":[],"viewport":{"x":2623.378922967084,"y":696.8541079344027,"zoom":0.5981384177708997}},"description":"A Bundle containing Notion components for Page and Database manipulation. 
You can list pages, users, and databases, update properties, create new pages, and add content to Notion pages.","name":"Notion - Components","last_tested_version":"1.0.0a36","is_component":false}
\ No newline at end of file
+{
+ "id": "7cd51434-9767-450f-8742-27857367f8c2",
+ "data": {
+ "nodes": [
+ {
+ "id": "RecordsToText-Q69g5",
+ "type": "genericNode",
+ "position": { "x": -2671.5528488127866, "y": -963.4266471378126 },
+ "data": {
+ "type": "RecordsToText",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "import requests\r\nfrom typing import List\r\n\r\nfrom langflow import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\n\r\nclass NotionUserList(CustomComponent):\r\n display_name = \"List Users [Notion]\"\r\n description = \"Retrieve users from Notion.\"\r\n documentation: str = \"https://docs.langflow.org/integrations/notion/list-users\"\r\n icon = \"NotionDirectoryLoader\"\r\n \r\n def build_config(self):\r\n return {\r\n \"notion_secret\": {\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n }\r\n\r\n def build(\r\n self,\r\n notion_secret: str,\r\n ) -> List[Record]:\r\n url = \"https://api.notion.com/v1/users\"\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Notion-Version\": \"2022-06-28\",\r\n }\r\n\r\n response = requests.get(url, headers=headers)\r\n response.raise_for_status()\r\n\r\n data = response.json()\r\n results = data['results']\r\n\r\n records = []\r\n for user in results:\r\n id = user['id']\r\n type = user['type']\r\n name = user.get('name', '')\r\n avatar_url = user.get('avatar_url', '')\r\n\r\n record_data = {\r\n \"id\": id,\r\n \"type\": type,\r\n \"name\": name,\r\n \"avatar_url\": avatar_url,\r\n }\r\n\r\n output = \"User:\\n\"\r\n for key, value in record_data.items():\r\n output += f\"{key.replace('_', ' ').title()}: {value}\\n\"\r\n output += \"________________________\\n\"\r\n\r\n record = Record(text=output, data=record_data)\r\n records.append(record)\r\n\r\n self.status = \"\\n\".join(record.text for record in records)\r\n return records",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "notion_secret": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "notion_secret",
+ "display_name": "Notion Secret",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The Notion integration token.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": ""
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Retrieve users from Notion.",
+ "icon": "NotionDirectoryLoader",
+ "base_classes": ["Record"],
+ "display_name": "List Users [Notion] ",
+ "documentation": "https://docs.langflow.org/integrations/notion/list-users",
+ "custom_fields": { "notion_secret": null },
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "RecordsToText-Q69g5",
+ "description": "Retrieve users from Notion.",
+ "display_name": "List Users [Notion] "
+ },
+ "selected": false,
+ "width": 384,
+ "height": 289,
+ "dragging": false,
+ "positionAbsolute": {
+ "x": -2671.5528488127866,
+ "y": -963.4266471378126
+ }
+ },
+ {
+ "id": "CustomComponent-PU0K5",
+ "type": "genericNode",
+ "position": { "x": -3077.2269116193215, "y": -960.9450220159636 },
+ "data": {
+ "type": "CustomComponent",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "import json\r\nfrom typing import Optional\r\n\r\nimport requests\r\nfrom langflow.custom import CustomComponent\r\n\r\n\r\nclass NotionPageCreator(CustomComponent):\r\n display_name = \"Create Page [Notion]\"\r\n description = \"A component for creating Notion pages.\"\r\n documentation: str = \"https://docs.langflow.org/integrations/notion/page-create\"\r\n icon = \"NotionDirectoryLoader\"\r\n\r\n def build_config(self):\r\n return {\r\n \"database_id\": {\r\n \"display_name\": \"Database ID\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The ID of the Notion database.\",\r\n },\r\n \"notion_secret\": {\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n \"properties\": {\r\n \"display_name\": \"Properties\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The properties of the new page. Depending on your database setup, this can change. E.G: {'Task name': {'id': 'title', 'type': 'title', 'title': [{'type': 'text', 'text': {'content': 'Send Notion Components to LF', 'link': null}}]}}\",\r\n },\r\n }\r\n\r\n def build(\r\n self,\r\n database_id: str,\r\n notion_secret: str,\r\n properties: str = '{\"Task name\": {\"id\": \"title\", \"type\": \"title\", \"title\": [{\"type\": \"text\", \"text\": {\"content\": \"Send Notion Components to LF\", \"link\": null}}]}}',\r\n ) -> str:\r\n if not database_id or not properties:\r\n raise ValueError(\"Invalid input. 
Please provide 'database_id' and 'properties'.\")\r\n\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Content-Type\": \"application/json\",\r\n \"Notion-Version\": \"2022-06-28\",\r\n }\r\n\r\n data = {\r\n \"parent\": {\"database_id\": database_id},\r\n \"properties\": json.loads(properties),\r\n }\r\n\r\n response = requests.post(\"https://api.notion.com/v1/pages\", headers=headers, json=data)\r\n\r\n if response.status_code == 200:\r\n page_id = response.json()[\"id\"]\r\n self.status = f\"Successfully created Notion page with ID: {page_id}\\n {str(response.json())}\"\r\n return response.json()\r\n else:\r\n error_message = f\"Failed to create Notion page. Status code: {response.status_code}, Error: {response.text}\"\r\n self.status = error_message\r\n raise Exception(error_message)",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "database_id": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "database_id",
+ "display_name": "Database ID",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The ID of the Notion database.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "notion_secret": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "notion_secret",
+ "display_name": "Notion Secret",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The Notion integration token.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": ""
+ },
+ "properties": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "{\"Task name\": {\"id\": \"title\", \"type\": \"title\", \"title\": [{\"type\": \"text\", \"text\": {\"content\": \"Send Notion Components to LF\", \"link\": null}}]}}",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "properties",
+ "display_name": "Properties",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The properties of the new page. Depending on your database setup, this can change. E.G: {'Task name': {'id': 'title', 'type': 'title', 'title': [{'type': 'text', 'text': {'content': 'Send Notion Components to LF', 'link': null}}]}}",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "A component for creating Notion pages.",
+ "icon": "NotionDirectoryLoader",
+ "base_classes": ["object", "str", "Text"],
+ "display_name": "Create Page [Notion] ",
+ "documentation": "https://docs.langflow.org/integrations/notion/page-create",
+ "custom_fields": {
+ "database_id": null,
+ "notion_secret": null,
+ "properties": null
+ },
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "CustomComponent-PU0K5",
+ "description": "A component for creating Notion pages.",
+ "display_name": "Create Page [Notion] "
+ },
+ "selected": false,
+ "width": 384,
+ "height": 477,
+ "positionAbsolute": {
+ "x": -3077.2269116193215,
+ "y": -960.9450220159636
+ },
+ "dragging": false
+ },
+ {
+ "id": "CustomComponent-YODla",
+ "type": "genericNode",
+ "position": { "x": -3485.297183150799, "y": -362.8525892356713 },
+ "data": {
+ "type": "CustomComponent",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "import requests\r\nfrom typing import Dict\r\n\r\nfrom langflow import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\n\r\nclass NotionDatabaseProperties(CustomComponent):\r\n display_name = \"List Database Properties [Notion]\"\r\n description = \"Retrieve properties of a Notion database.\"\r\n documentation: str = \"https://docs.langflow.org/integrations/notion/list-database-properties\"\r\n icon = \"NotionDirectoryLoader\"\r\n \r\n def build_config(self):\r\n return {\r\n \"database_id\": {\r\n \"display_name\": \"Database ID\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The ID of the Notion database.\",\r\n },\r\n \"notion_secret\": {\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n }\r\n\r\n def build(\r\n self,\r\n database_id: str,\r\n notion_secret: str,\r\n ) -> Record:\r\n url = f\"https://api.notion.com/v1/databases/{database_id}\"\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Notion-Version\": \"2022-06-28\", # Use the latest supported version\r\n }\r\n\r\n response = requests.get(url, headers=headers)\r\n response.raise_for_status()\r\n\r\n data = response.json()\r\n properties = data.get(\"properties\", {})\r\n\r\n record = Record(text=str(response.json()), data=properties)\r\n self.status = f\"Retrieved {len(properties)} properties from the Notion database.\\n {record.text}\"\r\n return record",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "database_id": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "database_id",
+ "display_name": "Database ID",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The ID of the Notion database.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "NOTION_NMSTX_DB_ID"
+ },
+ "notion_secret": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "notion_secret",
+ "display_name": "Notion Secret",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The Notion integration token.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": ""
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Retrieve properties of a Notion database.",
+ "icon": "NotionDirectoryLoader",
+ "base_classes": ["Record"],
+ "display_name": "List Database Properties [Notion] ",
+ "documentation": "https://docs.langflow.org/integrations/notion/list-database-properties",
+ "custom_fields": { "database_id": null, "notion_secret": null },
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "CustomComponent-YODla",
+ "description": "Retrieve properties of a Notion database.",
+ "display_name": "List Database Properties [Notion] "
+ },
+ "selected": true,
+ "width": 384,
+ "height": 383,
+ "dragging": false,
+ "positionAbsolute": { "x": -3485.297183150799, "y": -362.8525892356713 }
+ },
+ {
+ "id": "CustomComponent-wHlSz",
+ "type": "genericNode",
+ "position": { "x": -2668.7714642455403, "y": -657.2376228212606 },
+ "data": {
+ "type": "CustomComponent",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "import json\r\nimport requests\r\nfrom typing import Dict, Any\r\n\r\nfrom langflow import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\n\r\nclass NotionPageUpdate(CustomComponent):\r\n display_name = \"Update Page Property [Notion]\"\r\n description = \"Update the properties of a Notion page.\"\r\n documentation: str = \"https://docs.langflow.org/integrations/notion/page-update\"\r\n icon = \"NotionDirectoryLoader\"\r\n\r\n def build_config(self):\r\n return {\r\n \"page_id\": {\r\n \"display_name\": \"Page ID\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The ID of the Notion page to update.\",\r\n },\r\n \"properties\": {\r\n \"display_name\": \"Properties\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The properties to update on the page (as a JSON string).\",\r\n \"multiline\": True,\r\n },\r\n \"notion_secret\": {\r\n \"display_name\": \"Notion Secret\",\r\n \"field_type\": \"str\",\r\n \"info\": \"The Notion integration token.\",\r\n \"password\": True,\r\n },\r\n }\r\n\r\n def build(\r\n self,\r\n page_id: str,\r\n properties: str,\r\n notion_secret: str,\r\n ) -> Record:\r\n url = f\"https://api.notion.com/v1/pages/{page_id}\"\r\n headers = {\r\n \"Authorization\": f\"Bearer {notion_secret}\",\r\n \"Content-Type\": \"application/json\",\r\n \"Notion-Version\": \"2022-06-28\", # Use the latest supported version\r\n }\r\n\r\n try:\r\n parsed_properties = json.loads(properties)\r\n except json.JSONDecodeError as e:\r\n raise ValueError(\"Invalid JSON format for properties\") from e\r\n\r\n data = {\r\n \"properties\": parsed_properties\r\n }\r\n\r\n response = requests.patch(url, headers=headers, json=data)\r\n response.raise_for_status()\r\n\r\n updated_page = response.json()\r\n\r\n output = \"Updated page properties:\\n\"\r\n for prop_name, prop_value in updated_page[\"properties\"].items():\r\n output += f\"{prop_name}: {prop_value}\\n\"\r\n\r\n self.status = output\r\n return Record(data=updated_page)",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "notion_secret": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "notion_secret",
+ "display_name": "Notion Secret",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The Notion integration token.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": ""
+ },
+ "page_id": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "page_id",
+ "display_name": "Page ID",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The ID of the Notion page to update.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "properties": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "properties",
+ "display_name": "Properties",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The properties to update on the page (as a JSON string).",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "{ \"title\": [ { \"text\": { \"content\": \"Test Page\" } } ] }"
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Update the properties of a Notion page.",
+ "icon": "NotionDirectoryLoader",
+ "base_classes": ["Record"],
+ "display_name": "Update Page Property [Notion]",
+ "documentation": "https://docs.langflow.org/integrations/notion/page-update",
+ "custom_fields": {
+ "page_id": null,
+ "properties": null,
+ "notion_secret": null
+ },
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "CustomComponent-wHlSz",
+ "description": "Update the properties of a Notion page.",
+ "display_name": "Update Page Property [Notion]"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 477,
+ "dragging": false,
+ "positionAbsolute": {
+ "x": -2668.7714642455403,
+ "y": -657.2376228212606
+ }
+ },
+ {
+ "id": "CustomComponent-oelYw",
+ "type": "genericNode",
+ "position": { "x": -2253.1007124701327, "y": -448.47240118604134 },
+ "data": {
+ "type": "CustomComponent",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+              "value": "import requests\r\nfrom typing import Dict, Any\r\n\r\nfrom langflow import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\n\r\nclass NotionPageContent(CustomComponent):\r\n    display_name = \"Page Content Viewer [Notion]\"\r\n    description = \"Retrieve the content of a Notion page as plain text.\"\r\n    documentation: str = \"https://docs.langflow.org/integrations/notion/page-content-viewer\"\r\n    icon = \"NotionDirectoryLoader\"\r\n\r\n    def build_config(self):\r\n        return {\r\n            \"page_id\": {\r\n                \"display_name\": \"Page ID\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"The ID of the Notion page to retrieve.\",\r\n            },\r\n            \"notion_secret\": {\r\n                \"display_name\": \"Notion Secret\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"The Notion integration token.\",\r\n                \"password\": True,\r\n            },\r\n        }\r\n\r\n    def build(\r\n        self,\r\n        page_id: str,\r\n        notion_secret: str,\r\n    ) -> Record:\r\n        blocks_url = f\"https://api.notion.com/v1/blocks/{page_id}/children?page_size=100\"\r\n        headers = {\r\n            \"Authorization\": f\"Bearer {notion_secret}\",\r\n            \"Notion-Version\": \"2022-06-28\",  # Use the latest supported version\r\n        }\r\n\r\n        # Retrieve the child blocks\r\n        blocks_response = requests.get(blocks_url, headers=headers)\r\n        blocks_response.raise_for_status()\r\n        blocks_data = blocks_response.json()\r\n\r\n        # Parse the blocks and extract the content as plain text\r\n        content = self.parse_blocks(blocks_data[\"results\"])\r\n\r\n        self.status = content\r\n        return Record(data={\"content\": content}, text=content)\r\n\r\n    def parse_blocks(self, blocks: list) -> str:\r\n        content = \"\"\r\n        for block in blocks:\r\n            block_type = block[\"type\"]\r\n            if block_type in [\"paragraph\", \"heading_1\", \"heading_2\", \"heading_3\", \"quote\"]:\r\n                content += self.parse_rich_text(block[block_type][\"rich_text\"]) + \"\\n\\n\"\r\n            elif block_type in [\"bulleted_list_item\", \"numbered_list_item\"]:\r\n                content += self.parse_rich_text(block[block_type][\"rich_text\"]) + \"\\n\"\r\n            elif block_type == \"to_do\":\r\n                content += self.parse_rich_text(block[\"to_do\"][\"rich_text\"]) + \"\\n\"\r\n            elif block_type == \"code\":\r\n                content += self.parse_rich_text(block[\"code\"][\"rich_text\"]) + \"\\n\\n\"\r\n            elif block_type == \"image\":\r\n                content += f\"[Image: {block['image']['external']['url']}]\\n\\n\"\r\n            elif block_type == \"divider\":\r\n                content += \"---\\n\\n\"\r\n        return content.strip()\r\n\r\n    def parse_rich_text(self, rich_text: list) -> str:\r\n        text = \"\"\r\n        for segment in rich_text:\r\n            text += segment[\"plain_text\"]\r\n        return text",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "notion_secret": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "notion_secret",
+ "display_name": "Notion Secret",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The Notion integration token.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": ""
+ },
+ "page_id": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "page_id",
+ "display_name": "Page ID",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The ID of the Notion page to retrieve.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Retrieve the content of a Notion page as plain text.",
+ "icon": "NotionDirectoryLoader",
+ "base_classes": ["Record"],
+ "display_name": "Page Content Viewer [Notion] ",
+ "documentation": "https://docs.langflow.org/integrations/notion/page-content-viewer",
+ "custom_fields": { "page_id": null, "notion_secret": null },
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "CustomComponent-oelYw",
+ "description": "Retrieve the content of a Notion page as plain text.",
+ "display_name": "Page Content Viewer [Notion] "
+ },
+ "selected": false,
+ "width": 384,
+ "height": 383,
+ "positionAbsolute": {
+ "x": -2253.1007124701327,
+ "y": -448.47240118604134
+ },
+ "dragging": false
+ },
+ {
+ "id": "CustomComponent-Pn52w",
+ "type": "genericNode",
+ "position": { "x": -3070.9222948695096, "y": -472.4537855763852 },
+ "data": {
+ "type": "CustomComponent",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+              "value": "import requests\r\nimport json\r\nfrom typing import Dict, Any, List\r\nfrom langflow.custom import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\nclass NotionListPages(CustomComponent):\r\n    display_name = \"List Pages [Notion]\"\r\n    description = (\r\n        \"Query a Notion database with filtering and sorting. \"\r\n        \"The input should be a JSON string containing the 'filter' and 'sorts' objects. \"\r\n        \"Example input:\\n\"\r\n        '{\"filter\": {\"property\": \"Status\", \"select\": {\"equals\": \"Done\"}}, \"sorts\": [{\"timestamp\": \"created_time\", \"direction\": \"descending\"}]}'\r\n    )\r\n    documentation: str = \"https://docs.langflow.org/integrations/notion/list-pages\"\r\n    icon = \"NotionDirectoryLoader\"\r\n\r\n    field_order = [\r\n        \"notion_secret\",\r\n        \"database_id\",\r\n        \"query_payload\",\r\n    ]\r\n\r\n    def build_config(self):\r\n        return {\r\n            \"notion_secret\": {\r\n                \"display_name\": \"Notion Secret\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"The Notion integration token.\",\r\n                \"password\": True,\r\n            },\r\n            \"database_id\": {\r\n                \"display_name\": \"Database ID\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"The ID of the Notion database to query.\",\r\n            },\r\n            \"query_payload\": {\r\n                \"display_name\": \"Database query\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"A JSON string containing the filters that will be used for querying the database. EG: {'filter': {'property': 'Status', 'status': {'equals': 'In progress'}}}\",\r\n            },\r\n        }\r\n\r\n    def build(\r\n        self,\r\n        notion_secret: str,\r\n        database_id: str,\r\n        query_payload: str = \"{}\",\r\n    ) -> List[Record]:\r\n        try:\r\n            query_data = json.loads(query_payload)\r\n            filter_obj = query_data.get(\"filter\")\r\n            sorts = query_data.get(\"sorts\", [])\r\n\r\n            url = f\"https://api.notion.com/v1/databases/{database_id}/query\"\r\n            headers = {\r\n                \"Authorization\": f\"Bearer {notion_secret}\",\r\n                \"Content-Type\": \"application/json\",\r\n                \"Notion-Version\": \"2022-06-28\",\r\n            }\r\n\r\n            data = {\r\n                \"sorts\": sorts,\r\n            }\r\n\r\n            if filter_obj:\r\n                data[\"filter\"] = filter_obj\r\n\r\n            response = requests.post(url, headers=headers, json=data)\r\n            response.raise_for_status()\r\n\r\n            results = response.json()\r\n            records = []\r\n            combined_text = f\"Pages found: {len(results['results'])}\\n\\n\"\r\n            for page in results['results']:\r\n                page_data = {\r\n                    'id': page['id'],\r\n                    'url': page['url'],\r\n                    'created_time': page['created_time'],\r\n                    'last_edited_time': page['last_edited_time'],\r\n                    'properties': page['properties'],\r\n                }\r\n\r\n                text = (\r\n                    f\"id: {page['id']}\\n\"\r\n                    f\"url: {page['url']}\\n\"\r\n                    f\"created_time: {page['created_time']}\\n\"\r\n                    f\"last_edited_time: {page['last_edited_time']}\\n\"\r\n                    f\"properties: {json.dumps(page['properties'], indent=2)}\\n\\n\"\r\n                )\r\n\r\n                combined_text += text\r\n                records.append(Record(text=text, data=page_data))\r\n            \r\n            self.status = combined_text.strip()\r\n            return records\r\n\r\n        except Exception as e:\r\n            self.status = f\"An error occurred: {str(e)}\"\r\n            return [Record(text=self.status, data=[])]",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "database_id": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "database_id",
+ "display_name": "Database ID",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The ID of the Notion database to query.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "NOTION_NMSTX_DB_ID"
+ },
+ "notion_secret": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "notion_secret",
+ "display_name": "Notion Secret",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The Notion integration token.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": ""
+ },
+ "query_payload": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+              "value": "{}",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "query_payload",
+ "display_name": "Database query",
+ "advanced": false,
+ "dynamic": false,
+ "info": "A JSON string containing the filters that will be used for querying the database. EG: {'filter': {'property': 'Status', 'status': {'equals': 'In progress'}}}",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Query a Notion database with filtering and sorting. The input should be a JSON string containing the 'filter' and 'sorts' objects. Example input:\n{\"filter\": {\"property\": \"Status\", \"select\": {\"equals\": \"Done\"}}, \"sorts\": [{\"timestamp\": \"created_time\", \"direction\": \"descending\"}]}",
+ "icon": "NotionDirectoryLoader",
+ "base_classes": ["Record"],
+ "display_name": "List Pages [Notion] ",
+ "documentation": "https://docs.langflow.org/integrations/notion/list-pages",
+ "custom_fields": {
+ "notion_secret": null,
+ "database_id": null,
+ "query_payload": null
+ },
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": ["notion_secret", "database_id", "query_payload"],
+ "beta": false
+ },
+ "id": "CustomComponent-Pn52w",
+ "description": "Query a Notion database with filtering and sorting. The input should be a JSON string containing the 'filter' and 'sorts' objects. Example input:\n{\"filter\": {\"property\": \"Status\", \"select\": {\"equals\": \"Done\"}}, \"sorts\": [{\"timestamp\": \"created_time\", \"direction\": \"descending\"}]}",
+ "display_name": "List Pages [Notion] "
+ },
+ "selected": false,
+ "width": 384,
+ "height": 517,
+ "positionAbsolute": {
+ "x": -3070.9222948695096,
+ "y": -472.4537855763852
+ },
+ "dragging": false
+ },
+ {
+ "id": "CustomComponent-I8Dec",
+ "type": "genericNode",
+ "position": { "x": -2256.686402636563, "y": -963.4541117792749 },
+ "data": {
+ "type": "CustomComponent",
+ "node": {
+ "template": {
+ "block_id": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "block_id",
+ "display_name": "Page/Block ID",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The ID of the page/block to add the content.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+              "value": "import json\r\nfrom typing import List, Dict, Any\r\nfrom markdown import markdown\r\nfrom bs4 import BeautifulSoup\r\nimport requests\r\n\r\nfrom langflow import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\nclass AddContentToPage(CustomComponent):\r\n    display_name = \"Add Content to Page [Notion]\"\r\n    description = \"Convert markdown text to Notion blocks and append them to a Notion page.\"\r\n    documentation: str = \"https://developers.notion.com/reference/patch-block-children\"\r\n    icon = \"NotionDirectoryLoader\"\r\n\r\n    def build_config(self):\r\n        return {\r\n            \"markdown_text\": {\r\n                \"display_name\": \"Markdown Text\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"The markdown text to convert to Notion blocks.\",\r\n                \"multiline\": True,\r\n            },\r\n            \"block_id\": {\r\n                \"display_name\": \"Page/Block ID\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"The ID of the page/block to add the content.\",\r\n            },\r\n            \"notion_secret\": {\r\n                \"display_name\": \"Notion Secret\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"The Notion integration token.\",\r\n                \"password\": True,\r\n            },\r\n        }\r\n\r\n    def build(self, markdown_text: str, block_id: str, notion_secret: str) -> Record:\r\n        html_text = markdown(markdown_text)\r\n        soup = BeautifulSoup(html_text, 'html.parser')\r\n        blocks = self.process_node(soup)\r\n\r\n        url = f\"https://api.notion.com/v1/blocks/{block_id}/children\"\r\n        headers = {\r\n            \"Authorization\": f\"Bearer {notion_secret}\",\r\n            \"Content-Type\": \"application/json\",\r\n            \"Notion-Version\": \"2022-06-28\",\r\n        }\r\n\r\n        data = {\r\n            \"children\": blocks,\r\n        }\r\n\r\n        response = requests.patch(url, headers=headers, json=data)\r\n        self.status = str(response.json())\r\n        response.raise_for_status()\r\n\r\n        result = response.json()\r\n        self.status = f\"Appended {len(blocks)} blocks to page with ID: {block_id}\"\r\n        return Record(data=result, text=json.dumps(result))\r\n\r\n    def process_node(self, node):\r\n        blocks = []\r\n        if isinstance(node, str):\r\n            text = node.strip()\r\n            if text:\r\n                if text.startswith('#'):\r\n                    heading_level = text.count('#', 0, 6)\r\n                    heading_text = text[heading_level:].strip()\r\n                    if heading_level == 1:\r\n                        blocks.append(self.create_block('heading_1', heading_text))\r\n                    elif heading_level == 2:\r\n                        blocks.append(self.create_block('heading_2', heading_text))\r\n                    elif heading_level == 3:\r\n                        blocks.append(self.create_block('heading_3', heading_text))\r\n                else:\r\n                    blocks.append(self.create_block('paragraph', text))\r\n        elif node.name == 'h1':\r\n            blocks.append(self.create_block('heading_1', node.get_text(strip=True)))\r\n        elif node.name == 'h2':\r\n            blocks.append(self.create_block('heading_2', node.get_text(strip=True)))\r\n        elif node.name == 'h3':\r\n            blocks.append(self.create_block('heading_3', node.get_text(strip=True)))\r\n        elif node.name == 'p':\r\n            code_node = node.find('code')\r\n            if code_node:\r\n                code_text = code_node.get_text()\r\n                language, code = self.extract_language_and_code(code_text)\r\n                blocks.append(self.create_block('code', code, language=language))\r\n            elif self.is_table(str(node)):\r\n                blocks.extend(self.process_table(node))\r\n            else:\r\n                blocks.append(self.create_block('paragraph', node.get_text(strip=True)))\r\n        elif node.name == 'ul':\r\n            blocks.extend(self.process_list(node, 'bulleted_list_item'))\r\n        elif node.name == 'ol':\r\n            blocks.extend(self.process_list(node, 'numbered_list_item'))\r\n        elif node.name == 'blockquote':\r\n            blocks.append(self.create_block('quote', node.get_text(strip=True)))\r\n        elif node.name == 'hr':\r\n            blocks.append(self.create_block('divider', ''))\r\n        elif node.name == 'img':\r\n            blocks.append(self.create_block('image', '', image_url=node.get('src')))\r\n        elif node.name == 'a':\r\n            blocks.append(self.create_block('bookmark', node.get_text(strip=True), link_url=node.get('href')))\r\n        elif node.name == 'table':\r\n            blocks.extend(self.process_table(node))\r\n\r\n        for child in node.children:\r\n            if isinstance(child, str):\r\n                continue\r\n            blocks.extend(self.process_node(child))\r\n\r\n        return blocks\r\n\r\n    def extract_language_and_code(self, code_text):\r\n        lines = code_text.split('\\n')\r\n        language = lines[0].strip()\r\n        code = '\\n'.join(lines[1:]).strip()\r\n        return language, code\r\n\r\n    def is_code_block(self, text):\r\n        return text.startswith('```')\r\n\r\n    def extract_code_block(self, text):\r\n        lines = text.split('\\n')\r\n        language = lines[0].strip('`').strip()\r\n        code = '\\n'.join(lines[1:]).strip('`').strip()\r\n        return language, code\r\n    \r\n    def is_table(self, text):\r\n        rows = text.split('\\n')\r\n        if len(rows) < 2:\r\n            return False\r\n\r\n        has_separator = False\r\n        for i, row in enumerate(rows):\r\n            if '|' in row:\r\n                cells = [cell.strip() for cell in row.split('|')]\r\n                cells = [cell for cell in cells if cell]  # Remove empty cells\r\n                if i == 1 and all(set(cell) <= set('-|') for cell in cells):\r\n                    has_separator = True\r\n                elif not cells:\r\n                    return False\r\n\r\n        return has_separator and len(rows) >= 3\r\n\r\n    def process_list(self, node, list_type):\r\n        blocks = []\r\n        for item in node.find_all('li'):\r\n            item_text = item.get_text(strip=True)\r\n            checked = item_text.startswith('[x]')\r\n            is_checklist = item_text.startswith('[ ]') or checked\r\n\r\n            if is_checklist:\r\n                item_text = item_text.replace('[x]', '').replace('[ ]', '').strip()\r\n                blocks.append(self.create_block('to_do', item_text, checked=checked))\r\n            else:\r\n                blocks.append(self.create_block(list_type, item_text))\r\n        return blocks\r\n\r\n    def process_table(self, node):\r\n        blocks = []\r\n        header_row = node.find('thead').find('tr') if node.find('thead') else None\r\n        body_rows = node.find('tbody').find_all('tr') if node.find('tbody') else []\r\n\r\n        if header_row or body_rows:\r\n            table_width = max(len(header_row.find_all(['th', 'td'])) if header_row else 0,\r\n                              max(len(row.find_all(['th', 'td'])) for row in body_rows))\r\n\r\n            table_block = self.create_block('table', '', table_width=table_width, has_column_header=bool(header_row))\r\n            blocks.append(table_block)\r\n\r\n            if header_row:\r\n                header_cells = [cell.get_text(strip=True) for cell in header_row.find_all(['th', 'td'])]\r\n                header_row_block = self.create_block('table_row', header_cells)\r\n                blocks.append(header_row_block)\r\n\r\n            for row in body_rows:\r\n                cells = [cell.get_text(strip=True) for cell in row.find_all(['th', 'td'])]\r\n                row_block = self.create_block('table_row', cells)\r\n                blocks.append(row_block)\r\n\r\n        return blocks\r\n    \r\n    def create_block(self, block_type: str, content: str, **kwargs) -> Dict[str, Any]:\r\n        block = {\r\n            \"object\": \"block\",\r\n            \"type\": block_type,\r\n            block_type: {},\r\n        }\r\n\r\n        if block_type in [\"paragraph\", \"heading_1\", \"heading_2\", \"heading_3\", \"bulleted_list_item\", \"numbered_list_item\", \"quote\"]:\r\n            block[block_type][\"rich_text\"] = [\r\n                {\r\n                    \"type\": \"text\",\r\n                    \"text\": {\r\n                        \"content\": content,\r\n                    },\r\n                }\r\n            ]\r\n        elif block_type == 'to_do':\r\n            block[block_type][\"rich_text\"] = [\r\n                {\r\n                    \"type\": \"text\",\r\n                    \"text\": {\r\n                        \"content\": content,\r\n                    },\r\n                }\r\n            ]\r\n            block[block_type]['checked'] = kwargs.get('checked', False)\r\n        elif block_type == 'code':\r\n            block[block_type]['rich_text'] = [\r\n                {\r\n                    \"type\": \"text\",\r\n                    \"text\": {\r\n                        \"content\": content,\r\n                    },\r\n                }\r\n            ]\r\n            block[block_type]['language'] = kwargs.get('language', 'plain text')\r\n        elif block_type == 'image':\r\n            block[block_type] = {\r\n                \"type\": \"external\",\r\n                \"external\": {\r\n                    \"url\": kwargs.get('image_url', '')\r\n                }\r\n            }\r\n        elif block_type == 'divider':\r\n            pass\r\n        elif block_type == 'bookmark':\r\n            block[block_type]['url'] = kwargs.get('link_url', '')\r\n        elif block_type == 'table':\r\n            block[block_type]['table_width'] = kwargs.get('table_width', 0)\r\n            block[block_type]['has_column_header'] = kwargs.get('has_column_header', False)\r\n            block[block_type]['has_row_header'] = kwargs.get('has_row_header', False)\r\n        elif block_type == 'table_row':\r\n            block[block_type]['cells'] = [[{'type': 'text', 'text': {'content': cell}} for cell in content]]\r\n\r\n        return block",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "markdown_text": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "markdown_text",
+ "display_name": "Markdown Text",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The markdown text to convert to Notion blocks.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "# Heading 1\n\n## Heading 2\n\n### Heading 3\n\nThis is a regular paragraph.\n\nHere's another paragraph with an image:\n\n\n## Checklist\n- [x] Completed task\n- [ ] Incomplete task\n- [x] Another completed task\n\n## Numbered List\n1. First item\n2. Second item\n3. Third item\n\n## Bulleted List\n- Item 1\n- Item 2\n- Item 3\n\n## Code Block\n```python\ndef hello_world():\n print(\"Hello, World!\")\n```\n\n## Quote\n> This is a blockquote.\n> It can span multiple lines.\n\n## Horizontal Rule\n---\n\n\n## Link\n[Notion API Documentation](https://developers.notion.com)\n\n"
+ },
+ "notion_secret": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "notion_secret",
+ "display_name": "Notion Secret",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The Notion integration token.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": ""
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Convert markdown text to Notion blocks and append them to a Notion page.",
+ "icon": "NotionDirectoryLoader",
+ "base_classes": ["Record"],
+ "display_name": "Add Content to Page [Notion] ",
+ "documentation": "https://developers.notion.com/reference/patch-block-children",
+ "custom_fields": {
+ "markdown_text": null,
+ "block_id": null,
+ "notion_secret": null
+ },
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false,
+ "official": false
+ },
+ "id": "CustomComponent-I8Dec"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 497,
+ "positionAbsolute": {
+ "x": -2256.686402636563,
+ "y": -963.4541117792749
+ },
+ "dragging": false
+ },
+ {
+ "id": "CustomComponent-ZcsA9",
+ "type": "genericNode",
+ "position": { "x": -3488.029350341937, "y": -965.3756250644985 },
+ "data": {
+ "type": "CustomComponent",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+              "value": "import requests\r\nfrom typing import Dict, Any, List\r\nfrom langflow.custom import CustomComponent\r\nfrom langflow.schema import Record\r\n\r\nclass NotionSearch(CustomComponent):\r\n    display_name = \"Search Notion\"\r\n    description = (\r\n        \"Searches all pages and databases that have been shared with an integration.\"\r\n    )\r\n    documentation: str = \"https://docs.langflow.org/integrations/notion/search\"\r\n    icon = \"NotionDirectoryLoader\"\r\n\r\n    field_order = [\r\n        \"notion_secret\",\r\n        \"query\",\r\n        \"filter_value\",\r\n        \"sort_direction\",\r\n    ]\r\n\r\n    def build_config(self):\r\n        return {\r\n            \"notion_secret\": {\r\n                \"display_name\": \"Notion Secret\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"The Notion integration token.\",\r\n                \"password\": True,\r\n            },\r\n            \"query\": {\r\n                \"display_name\": \"Search Query\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"The text that the API compares page and database titles against.\",\r\n            },\r\n            \"filter_value\": {\r\n                \"display_name\": \"Filter Type\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"Limits the results to either only pages or only databases.\",\r\n                \"options\": [\"page\", \"database\"],\r\n                \"default_value\": \"page\",\r\n            },\r\n            \"sort_direction\": {\r\n                \"display_name\": \"Sort Direction\",\r\n                \"field_type\": \"str\",\r\n                \"info\": \"The direction to sort the results.\",\r\n                \"options\": [\"ascending\", \"descending\"],\r\n                \"default_value\": \"descending\",\r\n            },\r\n        }\r\n\r\n    def build(\r\n        self,\r\n        notion_secret: str,\r\n        query: str = \"\",\r\n        filter_value: str = \"page\",\r\n        sort_direction: str = \"descending\",\r\n    ) -> List[Record]:\r\n        try:\r\n            url = \"https://api.notion.com/v1/search\"\r\n            headers = {\r\n                \"Authorization\": f\"Bearer {notion_secret}\",\r\n                \"Content-Type\": \"application/json\",\r\n                \"Notion-Version\": \"2022-06-28\",\r\n            }\r\n\r\n            data = {\r\n                \"query\": query,\r\n                \"filter\": {\r\n                    \"value\": filter_value,\r\n                    \"property\": \"object\"\r\n                },\r\n                \"sort\":{\r\n                    \"direction\": sort_direction,\r\n                    \"timestamp\": \"last_edited_time\"\r\n                }\r\n            }\r\n\r\n            response = requests.post(url, headers=headers, json=data)\r\n            response.raise_for_status()\r\n\r\n            results = response.json()\r\n            records = []\r\n            combined_text = f\"Results found: {len(results['results'])}\\n\\n\"\r\n            for result in results['results']:\r\n                result_data = {\r\n                    'id': result['id'],\r\n                    'type': result['object'],\r\n                    'last_edited_time': result['last_edited_time'],\r\n                }\r\n                \r\n                if result['object'] == 'page':\r\n                    result_data['title_or_url'] = result['url']\r\n                    text = f\"id: {result['id']}\\ntitle_or_url: {result['url']}\\n\"\r\n                elif result['object'] == 'database':\r\n                    if 'title' in result and isinstance(result['title'], list) and len(result['title']) > 0:\r\n                        result_data['title_or_url'] = result['title'][0]['plain_text']\r\n                        text = f\"id: {result['id']}\\ntitle_or_url: {result['title'][0]['plain_text']}\\n\"\r\n                    else:\r\n                        result_data['title_or_url'] = \"N/A\"\r\n                        text = f\"id: {result['id']}\\ntitle_or_url: N/A\\n\"\r\n\r\n                text += f\"type: {result['object']}\\nlast_edited_time: {result['last_edited_time']}\\n\\n\"\r\n                combined_text += text\r\n                records.append(Record(text=text, data=result_data))\r\n            \r\n            self.status = combined_text\r\n            return records\r\n\r\n        except Exception as e:\r\n            self.status = f\"An error occurred: {str(e)}\"\r\n            return [Record(text=self.status, data=[])]",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "filter_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "database",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["page", "database"],
+ "name": "filter_value",
+ "display_name": "Filter Type",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Limits the results to either only pages or only databases.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "notion_secret": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "notion_secret",
+ "display_name": "Notion Secret",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The Notion integration token.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": ""
+ },
+ "query": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "query",
+ "display_name": "Search Query",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The text that the API compares page and database titles against.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sort_direction": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "descending",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["ascending", "descending"],
+ "name": "sort_direction",
+ "display_name": "Sort Direction",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The direction to sort the results.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Searches all pages and databases that have been shared with an integration.",
+ "icon": "NotionDirectoryLoader",
+ "base_classes": ["Record"],
+ "display_name": "Search [Notion]",
+ "documentation": "https://docs.langflow.org/integrations/notion/search",
+ "custom_fields": {
+ "notion_secret": null,
+ "query": null,
+ "filter_value": null,
+ "sort_direction": null
+ },
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "notion_secret",
+ "query",
+ "filter_value",
+ "sort_direction"
+ ],
+ "beta": false
+ },
+ "id": "CustomComponent-ZcsA9",
+ "description": "Searches all pages and databases that have been shared with an integration.",
+ "display_name": "Search [Notion]"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 591,
+ "positionAbsolute": {
+ "x": -3488.029350341937,
+ "y": -965.3756250644985
+ },
+ "dragging": false
+ }
+ ],
+ "edges": [],
+ "viewport": {
+ "x": 2623.378922967084,
+ "y": 696.8541079344027,
+ "zoom": 0.5981384177708997
+ }
+ },
+  "description": "A bundle containing Notion components for page and database manipulation. You can list pages, users, and databases, update properties, create new pages, and add content to Notion pages.",
+ "name": "Notion - Components",
+ "last_tested_version": "1.0.0a36",
+ "is_component": false
+}
diff --git a/scripts/gcp/GCP_DEPLOYMENT.md b/scripts/gcp/GCP_DEPLOYMENT.md
index 9f17e550b..a848d3d2b 100644
--- a/scripts/gcp/GCP_DEPLOYMENT.md
+++ b/scripts/gcp/GCP_DEPLOYMENT.md
@@ -20,8 +20,7 @@ When running as a [spot (preemptible) instance](https://cloud.google.com/compute
## Pricing (approximate)
-> For a more accurate breakdown of costs, please use the [**GCP Pricing Calculator**](https://cloud.google.com/products/calculator)
->
+> For a more accurate breakdown of costs, please use the [**GCP Pricing Calculator**](https://cloud.google.com/products/calculator) >
| Component | Regular Cost (Hourly) | Regular Cost (Monthly) | Spot/Preemptible Cost (Hourly) | Spot/Preemptible Cost (Monthly) | Notes |
| ------------------ | --------------------- | ---------------------- | ------------------------------ | ------------------------------- | -------------------------------------------------------------------------- |
diff --git a/src/backend/base/langflow/__main__.py b/src/backend/base/langflow/__main__.py
index 4162629dd..343188336 100644
--- a/src/backend/base/langflow/__main__.py
+++ b/src/backend/base/langflow/__main__.py
@@ -121,7 +121,7 @@ def run(
),
):
"""
- Run the Langflow.
+ Run Langflow.
"""
configure(log_level=log_level, log_file=log_file)
diff --git a/src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting (Hello, world!).json b/src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting (Hello, world!).json
index bdc6da29d..38e01111b 100644
--- a/src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting (Hello, world!).json
+++ b/src/backend/base/langflow/initial_setup/starter_projects/Basic Prompting (Hello, world!).json
@@ -1,886 +1,800 @@
{
- "id": "c091a57f-43a7-4a5e-b352-035ae8d8379c",
- "data": {
- "nodes": [
- {
- "id": "Prompt-uxBqP",
- "type": "genericNode",
- "position": {
- "x": 53.588791333410654,
- "y": -107.07318910019967
- },
- "data": {
- "type": "Prompt",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "template": {
- "type": "prompt",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "Answer the user as if you were a pirate.\n\nUser: {user_input}\n\nAnswer: ",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "template",
- "display_name": "Template",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent",
- "user_input": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "user_input",
- "display_name": "user_input",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- }
- },
- "description": "Create a prompt template with dynamic variables.",
- "icon": "prompts",
- "is_input": null,
- "is_output": null,
- "is_composition": null,
- "base_classes": [
- "object",
- "str",
- "Text"
- ],
- "name": "",
- "display_name": "Prompt",
- "documentation": "",
- "custom_fields": {
- "template": [
- "user_input"
- ]
- },
- "output_types": [
- "Text"
- ],
- "full_path": null,
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false,
- "error": null
- },
- "id": "Prompt-uxBqP",
- "description": "Create a prompt template with dynamic variables.",
- "display_name": "Prompt"
- },
- "selected": true,
- "width": 384,
- "height": 383,
- "dragging": false,
- "positionAbsolute": {
- "x": 53.588791333410654,
- "y": -107.07318910019967
- }
+ "id": "c091a57f-43a7-4a5e-b352-035ae8d8379c",
+ "data": {
+ "nodes": [
+ {
+ "id": "Prompt-uxBqP",
+ "type": "genericNode",
+ "position": {
+ "x": 53.588791333410654,
+ "y": -107.07318910019967
+ },
+ "data": {
+ "type": "Prompt",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "template": {
+ "type": "prompt",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "Answer the user as if you were a pirate.\n\nUser: {user_input}\n\nAnswer: ",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "template",
+ "display_name": "Template",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent",
+ "user_input": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "user_input",
+ "display_name": "user_input",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ }
},
- {
- "id": "OpenAIModel-k39HS",
- "type": "genericNode",
- "position": {
- "x": 634.8148772766217,
- "y": 27.035057029045305
- },
- "data": {
- "type": "OpenAIModel",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Input",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": MODEL_NAMES,\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. 
Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float,\n model_name: str = \"gpt-4o\",\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n output = ChatOpenAI(\n max_tokens=max_tokens or None,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_tokens": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 256,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_tokens",
- "display_name": "Max Tokens",
- "advanced": true,
- "dynamic": false,
- "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
- "load_from_db": false,
- "title_case": false
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "gpt-3.5-turbo",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "gpt-4o",
- "gpt-4-turbo",
- "gpt-4-turbo-preview",
- "gpt-3.5-turbo",
- "gpt-3.5-turbo-0125"
- ],
- "name": "model_name",
- "display_name": "Model Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "The OpenAI API Key to use for the OpenAI model.",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "OPENAI_API_KEY"
- },
- "stream": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "stream",
- "display_name": "Stream",
- "advanced": true,
- "dynamic": false,
- "info": "Stream the response from the model. Streaming works only in Chat.",
- "load_from_db": false,
- "title_case": false
- },
- "system_message": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "system_message",
- "display_name": "System Message",
- "advanced": true,
- "dynamic": false,
- "info": "System message to pass to the model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "temperature": {
- "type": "float",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 0.1,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "temperature",
- "display_name": "Temperature",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent"
- },
- "description": "Generates text using OpenAI LLMs.",
- "icon": "OpenAI",
- "base_classes": [
- "object",
- "Text",
- "str"
- ],
- "display_name": "OpenAI",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "openai_api_key": null,
- "temperature": null,
- "model_name": null,
- "max_tokens": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "stream": null,
- "system_message": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "max_tokens",
- "model_kwargs",
- "model_name",
- "openai_api_base",
- "openai_api_key",
- "temperature",
- "input_value",
- "system_message",
- "stream"
- ],
- "beta": false
- },
- "id": "OpenAIModel-k39HS",
- "description": "Generates text using OpenAI LLMs.",
- "display_name": "OpenAI"
- },
- "selected": false,
- "width": 384,
- "height": 563,
- "positionAbsolute": {
- "x": 634.8148772766217,
- "y": 27.035057029045305
- },
- "dragging": false
+ "description": "Create a prompt template with dynamic variables.",
+ "icon": "prompts",
+ "is_input": null,
+ "is_output": null,
+ "is_composition": null,
+ "base_classes": ["object", "str", "Text"],
+ "name": "",
+ "display_name": "Prompt",
+ "documentation": "",
+ "custom_fields": {
+ "template": ["user_input"]
},
- {
- "id": "ChatOutput-njtka",
- "type": "genericNode",
- "position": {
- "x": 1193.250417197867,
- "y": 71.88476890163852
- },
- "data": {
- "type": "ChatOutput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "{text}",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "In case of Message being a Record, this template will be used to convert it to text.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Machine",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "AI",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": true,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a chat message in the Playground.",
- "icon": "ChatOutput",
- "base_classes": [
- "Record",
- "Text",
- "str",
- "object"
- ],
- "display_name": "Chat Output",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null,
- "record_template": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatOutput-njtka"
- },
- "selected": false,
- "width": 384,
- "height": 383,
- "positionAbsolute": {
- "x": 1193.250417197867,
- "y": 71.88476890163852
- },
- "dragging": false
- },
- {
- "id": "ChatInput-P3fgL",
- "type": "genericNode",
- "position": {
- "x": -495.2223093083827,
- "y": -232.56998443685862
- },
- "data": {
- "type": "ChatInput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n\n def build_config(self):\n build_config = super().build_config()\n build_config[\"input_value\"] = {\n \"input_types\": [],\n \"display_name\": \"Message\",\n \"multiline\": True,\n }\n\n return build_config\n\n def build(\n self,\n sender: Optional[str] = \"User\",\n sender_name: Optional[str] = \"User\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n ) -> Union[Text, Record]:\n return super().build_no_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "value": "hi"
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "User",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "User",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": true,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Get chat inputs from the Playground.",
- "icon": "ChatInput",
- "base_classes": [
- "object",
- "Record",
- "str",
- "Text"
- ],
- "display_name": "Chat Input",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatInput-P3fgL"
- },
- "selected": false,
- "width": 384,
- "height": 375,
- "positionAbsolute": {
- "x": -495.2223093083827,
- "y": -232.56998443685862
- },
- "dragging": false
- }
- ],
- "edges": [
- {
- "source": "OpenAIModel-k39HS",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-k39HS\u0153}",
- "target": "ChatOutput-njtka",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-njtka\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "ChatOutput-njtka",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "Text",
- "str"
- ],
- "dataType": "OpenAIModel",
- "id": "OpenAIModel-k39HS"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-OpenAIModel-k39HS{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-k39HS\u0153}-ChatOutput-njtka{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-njtka\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- },
- {
- "source": "Prompt-uxBqP",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-uxBqP\u0153}",
- "target": "OpenAIModel-k39HS",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-k39HS\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "OpenAIModel-k39HS",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "str",
- "Text"
- ],
- "dataType": "Prompt",
- "id": "Prompt-uxBqP"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-Prompt-uxBqP{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-uxBqP\u0153}-OpenAIModel-k39HS{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-k39HS\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- },
- {
- "source": "ChatInput-P3fgL",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Record\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-P3fgL\u0153}",
- "target": "Prompt-uxBqP",
- "targetHandle": "{\u0153fieldName\u0153:\u0153user_input\u0153,\u0153id\u0153:\u0153Prompt-uxBqP\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "user_input",
- "id": "Prompt-uxBqP",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "Record",
- "str",
- "Text"
- ],
- "dataType": "ChatInput",
- "id": "ChatInput-P3fgL"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-ChatInput-P3fgL{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Record\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-P3fgL\u0153}-Prompt-uxBqP{\u0153fieldName\u0153:\u0153user_input\u0153,\u0153id\u0153:\u0153Prompt-uxBqP\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- }
- ],
- "viewport": {
- "x": 260.58251815500563,
- "y": 318.2261172111936,
- "zoom": 0.43514115784696294
+ "output_types": ["Text"],
+ "full_path": null,
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false,
+ "error": null
+ },
+ "id": "Prompt-uxBqP",
+ "description": "Create a prompt template with dynamic variables.",
+ "display_name": "Prompt"
+ },
+ "selected": true,
+ "width": 384,
+ "height": 383,
+ "dragging": false,
+ "positionAbsolute": {
+ "x": 53.588791333410654,
+ "y": -107.07318910019967
}
- },
- "description": "This flow will get you experimenting with the basics of the UI, the Chat and the Prompt component. \n\nTry changing the Template in it to see how the model behaves. \nYou can change it to this and a Text Input into the `type_of_person` variable : \"Answer the user as if you were a pirate.\n\nUser: {user_input}\n\nAnswer: \" ",
- "name": "Basic Prompting (Hello, World)",
- "last_tested_version": "1.0.0a4",
- "is_component": false
+ },
+ {
+ "id": "OpenAIModel-k39HS",
+ "type": "genericNode",
+ "position": {
+ "x": 634.8148772766217,
+ "y": 27.035057029045305
+ },
+ "data": {
+ "type": "OpenAIModel",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Input",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": MODEL_NAMES,\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. 
Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float = 0.1,\n model_name: str = \"gpt-4o\",\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n output = ChatOpenAI(\n max_tokens=max_tokens or None,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_tokens": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 256,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_tokens",
+ "display_name": "Max Tokens",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "gpt-3.5-turbo",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "gpt-4o",
+ "gpt-4-turbo",
+ "gpt-4-turbo-preview",
+ "gpt-3.5-turbo",
+ "gpt-3.5-turbo-0125"
+ ],
+ "name": "model_name",
+ "display_name": "Model Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The OpenAI API Key to use for the OpenAI model.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "OPENAI_API_KEY"
+ },
+ "stream": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "stream",
+ "display_name": "Stream",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Stream the response from the model. Streaming works only in Chat.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "system_message": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "system_message",
+ "display_name": "System Message",
+ "advanced": true,
+ "dynamic": false,
+ "info": "System message to pass to the model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "temperature": {
+ "type": "float",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 0.1,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "temperature",
+ "display_name": "Temperature",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
+ },
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Generates text using OpenAI LLMs.",
+ "icon": "OpenAI",
+ "base_classes": ["object", "Text", "str"],
+ "display_name": "OpenAI",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "openai_api_key": null,
+ "temperature": null,
+ "model_name": null,
+ "max_tokens": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "stream": null,
+ "system_message": null
+ },
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "max_tokens",
+ "model_kwargs",
+ "model_name",
+ "openai_api_base",
+ "openai_api_key",
+ "temperature",
+ "input_value",
+ "system_message",
+ "stream"
+ ],
+ "beta": false
+ },
+ "id": "OpenAIModel-k39HS",
+ "description": "Generates text using OpenAI LLMs.",
+ "display_name": "OpenAI"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 563,
+ "positionAbsolute": {
+ "x": 634.8148772766217,
+ "y": 27.035057029045305
+ },
+ "dragging": false
+ },
+ {
+ "id": "ChatOutput-njtka",
+ "type": "genericNode",
+ "position": {
+ "x": 1193.250417197867,
+ "y": 71.88476890163852
+ },
+ "data": {
+ "type": "ChatOutput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "{text}",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "In case of Message being a Record, this template will be used to convert it to text.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Machine",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "AI",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Display a chat message in the Playground.",
+ "icon": "ChatOutput",
+ "base_classes": ["Record", "Text", "str", "object"],
+ "display_name": "Chat Output",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null,
+ "record_template": null
+ },
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatOutput-njtka"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 383,
+ "positionAbsolute": {
+ "x": 1193.250417197867,
+ "y": 71.88476890163852
+ },
+ "dragging": false
+ },
+ {
+ "id": "ChatInput-P3fgL",
+ "type": "genericNode",
+ "position": {
+ "x": -495.2223093083827,
+ "y": -232.56998443685862
+ },
+ "data": {
+ "type": "ChatInput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n\n def build_config(self):\n build_config = super().build_config()\n build_config[\"input_value\"] = {\n \"input_types\": [],\n \"display_name\": \"Message\",\n \"multiline\": True,\n }\n\n return build_config\n\n def build(\n self,\n sender: Optional[str] = \"User\",\n sender_name: Optional[str] = \"User\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n ) -> Union[Text, Record]:\n return super().build_no_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": [],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "value": "hi"
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "User",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "User",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Get chat inputs from the Playground.",
+ "icon": "ChatInput",
+ "base_classes": ["object", "Record", "str", "Text"],
+ "display_name": "Chat Input",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null
+ },
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatInput-P3fgL"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 375,
+ "positionAbsolute": {
+ "x": -495.2223093083827,
+ "y": -232.56998443685862
+ },
+ "dragging": false
+ }
+ ],
+ "edges": [
+ {
+ "source": "OpenAIModel-k39HS",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-k39HSœ}",
+ "target": "ChatOutput-njtka",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-njtkaœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "ChatOutput-njtka",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "Text", "str"],
+ "dataType": "OpenAIModel",
+ "id": "OpenAIModel-k39HS"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-OpenAIModel-k39HS{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-k39HSœ}-ChatOutput-njtka{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-njtkaœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "Prompt-uxBqP",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-uxBqPœ}",
+ "target": "OpenAIModel-k39HS",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-k39HSœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "OpenAIModel-k39HS",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "str", "Text"],
+ "dataType": "Prompt",
+ "id": "Prompt-uxBqP"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-Prompt-uxBqP{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-uxBqPœ}-OpenAIModel-k39HS{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-k39HSœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "ChatInput-P3fgL",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œRecordœ,œstrœ,œTextœ],œdataTypeœ:œChatInputœ,œidœ:œChatInput-P3fgLœ}",
+ "target": "Prompt-uxBqP",
+ "targetHandle": "{œfieldNameœ:œuser_inputœ,œidœ:œPrompt-uxBqPœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "user_input",
+ "id": "Prompt-uxBqP",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "Record", "str", "Text"],
+ "dataType": "ChatInput",
+ "id": "ChatInput-P3fgL"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-ChatInput-P3fgL{œbaseClassesœ:[œobjectœ,œRecordœ,œstrœ,œTextœ],œdataTypeœ:œChatInputœ,œidœ:œChatInput-P3fgLœ}-Prompt-uxBqP{œfieldNameœ:œuser_inputœ,œidœ:œPrompt-uxBqPœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}"
+ }
+ ],
+ "viewport": {
+ "x": 260.58251815500563,
+ "y": 318.2261172111936,
+ "zoom": 0.43514115784696294
+ }
+ },
+ "description": "This flow will get you experimenting with the basics of the UI, the Chat and the Prompt component. \n\nTry changing the Template in it to see how the model behaves. \nYou can change it to this and a Text Input into the `type_of_person` variable : \"Answer the user as if you were a pirate.\n\nUser: {user_input}\n\nAnswer: \" ",
+ "name": "Basic Prompting (Hello, World)",
+ "last_tested_version": "1.0.0a4",
+ "is_component": false
}
diff --git a/src/backend/base/langflow/initial_setup/starter_projects/Langflow Blog Writter.json b/src/backend/base/langflow/initial_setup/starter_projects/Langflow Blog Writter.json
index bd6013ad5..fcdef4056 100644
--- a/src/backend/base/langflow/initial_setup/starter_projects/Langflow Blog Writter.json
+++ b/src/backend/base/langflow/initial_setup/starter_projects/Langflow Blog Writter.json
@@ -1,1096 +1,987 @@
{
- "id": "6ad5559d-fb66-4fdc-8f98-96f4ac12799d",
- "data": {
- "nodes": [
- {
- "id": "Prompt-Rse03",
- "type": "genericNode",
- "position": {
- "x": 1331.381712783371,
- "y": 535.0279854229713
- },
- "data": {
- "type": "Prompt",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "template": {
- "type": "prompt",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "Reference 1:\n\n{reference_1}\n\n---\n\nReference 2:\n\n{reference_2}\n\n---\n\n{instructions}\n\nBlog: \n\n\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "template",
- "display_name": "Template",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent",
- "reference_1": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "reference_1",
- "display_name": "reference_1",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- },
- "reference_2": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "reference_2",
- "display_name": "reference_2",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- },
- "instructions": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "instructions",
- "display_name": "instructions",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- }
- },
- "description": "Create a prompt template with dynamic variables.",
- "icon": "prompts",
- "is_input": null,
- "is_output": null,
- "is_composition": null,
- "base_classes": [
- "object",
- "Text",
- "str"
- ],
- "name": "",
- "display_name": "Prompt",
- "documentation": "",
- "custom_fields": {
- "template": [
- "reference_1",
- "reference_2",
- "instructions"
- ]
- },
- "output_types": [
- "Text"
- ],
- "full_path": null,
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false,
- "error": null
- },
- "id": "Prompt-Rse03",
- "description": "Create a prompt template with dynamic variables.",
- "display_name": "Prompt"
- },
- "selected": false,
- "width": 384,
- "height": 571,
- "dragging": false,
- "positionAbsolute": {
- "x": 1331.381712783371,
- "y": 535.0279854229713
- }
+ "id": "6ad5559d-fb66-4fdc-8f98-96f4ac12799d",
+ "data": {
+ "nodes": [
+ {
+ "id": "Prompt-Rse03",
+ "type": "genericNode",
+ "position": {
+ "x": 1331.381712783371,
+ "y": 535.0279854229713
+ },
+ "data": {
+ "type": "Prompt",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "template": {
+ "type": "prompt",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "Reference 1:\n\n{reference_1}\n\n---\n\nReference 2:\n\n{reference_2}\n\n---\n\n{instructions}\n\nBlog: \n\n\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "template",
+ "display_name": "Template",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent",
+ "reference_1": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "reference_1",
+ "display_name": "reference_1",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ },
+ "reference_2": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "reference_2",
+ "display_name": "reference_2",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ },
+ "instructions": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "instructions",
+ "display_name": "instructions",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ }
},
- {
- "id": "URL-HYPkR",
- "type": "genericNode",
- "position": {
- "x": 568.2971412887712,
- "y": 700.9983368007821
- },
- "data": {
- "type": "URL",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Any, Dict\n\nfrom langchain_community.document_loaders.web_base import WebBaseLoader\n\nfrom langflow.custom import CustomComponent\nfrom langflow.schema import Record\n\n\nclass URLComponent(CustomComponent):\n display_name = \"URL\"\n description = \"Fetch content from one or more URLs.\"\n icon = \"layout-template\"\n\n def build_config(self) -> Dict[str, Any]:\n return {\n \"urls\": {\"display_name\": \"URL\"},\n }\n\n def build(\n self,\n urls: list[str],\n ) -> list[Record]:\n loader = WebBaseLoader(web_paths=urls)\n docs = loader.load()\n records = self.to_records(docs)\n self.status = records\n return records\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "urls": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "urls",
- "display_name": "URL",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": [
- "https://www.promptingguide.ai/techniques/prompt_chaining"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Fetch content from one or more URLs.",
- "icon": "layout-template",
- "base_classes": [
- "Record"
- ],
- "display_name": "URL",
- "documentation": "",
- "custom_fields": {
- "urls": null
- },
- "output_types": [
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "URL-HYPkR"
- },
- "selected": false,
- "width": 384,
- "height": 281,
- "positionAbsolute": {
- "x": 568.2971412887712,
- "y": 700.9983368007821
- },
- "dragging": false
+ "description": "Create a prompt template with dynamic variables.",
+ "icon": "prompts",
+ "is_input": null,
+ "is_output": null,
+ "is_composition": null,
+ "base_classes": ["object", "Text", "str"],
+ "name": "",
+ "display_name": "Prompt",
+ "documentation": "",
+ "custom_fields": {
+ "template": ["reference_1", "reference_2", "instructions"]
},
- {
- "id": "ChatOutput-JPlxl",
- "type": "genericNode",
- "position": {
- "x": 2503.8617424688505,
- "y": 789.3005578928434
- },
- "data": {
- "type": "ChatOutput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "{text}",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "In case of Message being a Record, this template will be used to convert it to text.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Machine",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "AI",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": true,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a chat message in the Playground.",
- "icon": "ChatOutput",
- "base_classes": [
- "Text",
- "Record",
- "object",
- "str"
- ],
- "display_name": "Chat Output",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null,
- "record_template": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatOutput-JPlxl"
- },
- "selected": false,
- "width": 384,
- "height": 383
- },
- {
- "id": "OpenAIModel-gi29P",
- "type": "genericNode",
- "position": {
- "x": 1917.7089968570963,
- "y": 575.9186499244129
- },
- "data": {
- "type": "OpenAIModel",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Input",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n    display_name = \"OpenAI\"\n    description = \"Generates text using OpenAI LLMs.\"\n    icon = \"OpenAI\"\n\n    field_order = [\n        \"max_tokens\",\n        \"model_kwargs\",\n        \"model_name\",\n        \"openai_api_base\",\n        \"openai_api_key\",\n        \"temperature\",\n        \"input_value\",\n        \"system_message\",\n        \"stream\",\n    ]\n\n    def build_config(self):\n        return {\n            \"input_value\": {\"display_name\": \"Input\"},\n            \"max_tokens\": {\n                \"display_name\": \"Max Tokens\",\n                \"advanced\": True,\n                \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n            },\n            \"model_kwargs\": {\n                \"display_name\": \"Model Kwargs\",\n                \"advanced\": True,\n            },\n            \"model_name\": {\n                \"display_name\": \"Model Name\",\n                \"advanced\": False,\n                \"options\": MODEL_NAMES,\n            },\n            \"openai_api_base\": {\n                \"display_name\": \"OpenAI API Base\",\n                \"advanced\": True,\n                \"info\": (\n                    \"The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\\n\\n\"\n                    \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n                ),\n            },\n            \"openai_api_key\": {\n                \"display_name\": \"OpenAI API Key\",\n                \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n                \"advanced\": False,\n                \"password\": True,\n            },\n            \"temperature\": {\n                \"display_name\": \"Temperature\",\n                \"advanced\": False,\n                \"value\": 0.1,\n            },\n            \"stream\": {\n                \"display_name\": \"Stream\",\n                \"info\": STREAM_INFO_TEXT,\n                \"advanced\": True,\n            },\n            \"system_message\": {\n                \"display_name\": \"System Message\",\n                \"info\": \"System message to pass to the model.\",\n                \"advanced\": True,\n            },\n        }\n\n    def build(\n        self,\n        input_value: Text,\n        openai_api_key: str,\n        temperature: float,\n        model_name: str = \"gpt-4o\",\n        max_tokens: Optional[int] = 256,\n        model_kwargs: NestedDict = {},\n        openai_api_base: Optional[str] = None,\n        stream: bool = False,\n        system_message: Optional[str] = None,\n    ) -> Text:\n        if not openai_api_base:\n            openai_api_base = \"https://api.openai.com/v1\"\n        if openai_api_key:\n            api_key = SecretStr(openai_api_key)\n        else:\n            api_key = None\n\n        output = ChatOpenAI(\n            max_tokens=max_tokens or None,\n            model_kwargs=model_kwargs,\n            model=model_name,\n            base_url=openai_api_base,\n            api_key=api_key,\n            temperature=temperature,\n        )\n\n        return self.get_chat_result(output, stream, input_value, system_message)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_tokens": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "1024",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_tokens",
- "display_name": "Max Tokens",
- "advanced": true,
- "dynamic": false,
- "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
- "load_from_db": false,
- "title_case": false
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "gpt-3.5-turbo-0125",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "gpt-4o",
- "gpt-4-turbo",
- "gpt-4-turbo-preview",
- "gpt-3.5-turbo",
- "gpt-3.5-turbo-0125"
- ],
- "name": "model_name",
- "display_name": "Model Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "The OpenAI API Key to use for the OpenAI model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "OPENAI_API_KEY"
- },
- "stream": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "stream",
- "display_name": "Stream",
- "advanced": true,
- "dynamic": false,
- "info": "Stream the response from the model. Streaming works only in Chat.",
- "load_from_db": false,
- "title_case": false
- },
- "system_message": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "system_message",
- "display_name": "System Message",
- "advanced": true,
- "dynamic": false,
- "info": "System message to pass to the model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "temperature": {
- "type": "float",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "0.1",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "temperature",
- "display_name": "Temperature",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent"
- },
- "description": "Generates text using OpenAI LLMs.",
- "icon": "OpenAI",
- "base_classes": [
- "str",
- "Text",
- "object"
- ],
- "display_name": "OpenAI",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "openai_api_key": null,
- "temperature": null,
- "model_name": null,
- "max_tokens": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "stream": null,
- "system_message": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "max_tokens",
- "model_kwargs",
- "model_name",
- "openai_api_base",
- "openai_api_key",
- "temperature",
- "input_value",
- "system_message",
- "stream"
- ],
- "beta": false
- },
- "id": "OpenAIModel-gi29P"
- },
- "selected": false,
- "width": 384,
- "height": 563,
- "positionAbsolute": {
- "x": 1917.7089968570963,
- "y": 575.9186499244129
- },
- "dragging": false
- },
- {
- "id": "URL-2cX90",
- "type": "genericNode",
- "position": {
- "x": 573.961301764604,
- "y": 336.41463436122086
- },
- "data": {
- "type": "URL",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Any, Dict\n\nfrom langchain_community.document_loaders.web_base import WebBaseLoader\n\nfrom langflow.custom import CustomComponent\nfrom langflow.schema import Record\n\n\nclass URLComponent(CustomComponent):\n display_name = \"URL\"\n description = \"Fetch content from one or more URLs.\"\n icon = \"layout-template\"\n\n def build_config(self) -> Dict[str, Any]:\n return {\n \"urls\": {\"display_name\": \"URL\"},\n }\n\n def build(\n self,\n urls: list[str],\n ) -> list[Record]:\n loader = WebBaseLoader(web_paths=urls)\n docs = loader.load()\n records = self.to_records(docs)\n self.status = records\n return records\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "urls": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "urls",
- "display_name": "URL",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": [
- "https://www.promptingguide.ai/introduction/basics"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Fetch content from one or more URLs.",
- "icon": "layout-template",
- "base_classes": [
- "Record"
- ],
- "display_name": "URL",
- "documentation": "",
- "custom_fields": {
- "urls": null
- },
- "output_types": [
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "URL-2cX90"
- },
- "selected": false,
- "width": 384,
- "height": 281,
- "positionAbsolute": {
- "x": 573.961301764604,
- "y": 336.41463436122086
- },
- "dragging": false
- },
- {
- "id": "TextInput-og8Or",
- "type": "genericNode",
- "position": {
- "x": 569.9387927203336,
- "y": 1095.3352160671316
- },
- "data": {
- "type": "TextInput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextInput(TextComponent):\n display_name = \"Text Input\"\n description = \"Get text inputs from the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as input.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Optional[str] = \"\",\n record_template: Optional[str] = \"\",\n ) -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "Use the references above for style to write a new blog/tutorial about prompt engineering techniques. Suggest non-covered topics.",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Value",
- "advanced": false,
- "input_types": [
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "Text or Record to be passed as input.",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Get text inputs from the Playground.",
- "icon": "type",
- "base_classes": [
- "object",
- "Text",
- "str"
- ],
- "display_name": "Instructions",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "record_template": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "TextInput-og8Or"
- },
- "selected": false,
- "width": 384,
- "height": 289,
- "positionAbsolute": {
- "x": 569.9387927203336,
- "y": 1095.3352160671316
- },
- "dragging": false
- }
- ],
- "edges": [
- {
- "source": "URL-HYPkR",
- "target": "Prompt-Rse03",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153URL\u0153,\u0153id\u0153:\u0153URL-HYPkR\u0153}",
- "targetHandle": "{\u0153fieldName\u0153:\u0153reference_2\u0153,\u0153id\u0153:\u0153Prompt-Rse03\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "id": "reactflow__edge-URL-HYPkR{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153URL\u0153,\u0153id\u0153:\u0153URL-HYPkR\u0153}-Prompt-Rse03{\u0153fieldName\u0153:\u0153reference_2\u0153,\u0153id\u0153:\u0153Prompt-Rse03\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "reference_2",
- "id": "Prompt-Rse03",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "Record"
- ],
- "dataType": "URL",
- "id": "URL-HYPkR"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "selected": false
- },
- {
- "source": "OpenAIModel-gi29P",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-gi29P\u0153}",
- "target": "ChatOutput-JPlxl",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-JPlxl\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "ChatOutput-JPlxl",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "str",
- "Text",
- "object"
- ],
- "dataType": "OpenAIModel",
- "id": "OpenAIModel-gi29P"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-OpenAIModel-gi29P{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-gi29P\u0153}-ChatOutput-JPlxl{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-JPlxl\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- },
- {
- "source": "URL-2cX90",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153URL\u0153,\u0153id\u0153:\u0153URL-2cX90\u0153}",
- "target": "Prompt-Rse03",
- "targetHandle": "{\u0153fieldName\u0153:\u0153reference_1\u0153,\u0153id\u0153:\u0153Prompt-Rse03\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "reference_1",
- "id": "Prompt-Rse03",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "Record"
- ],
- "dataType": "URL",
- "id": "URL-2cX90"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-URL-2cX90{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153URL\u0153,\u0153id\u0153:\u0153URL-2cX90\u0153}-Prompt-Rse03{\u0153fieldName\u0153:\u0153reference_1\u0153,\u0153id\u0153:\u0153Prompt-Rse03\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- },
- {
- "source": "TextInput-og8Or",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153TextInput\u0153,\u0153id\u0153:\u0153TextInput-og8Or\u0153}",
- "target": "Prompt-Rse03",
- "targetHandle": "{\u0153fieldName\u0153:\u0153instructions\u0153,\u0153id\u0153:\u0153Prompt-Rse03\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "instructions",
- "id": "Prompt-Rse03",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "Text",
- "str"
- ],
- "dataType": "TextInput",
- "id": "TextInput-og8Or"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-TextInput-og8Or{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153TextInput\u0153,\u0153id\u0153:\u0153TextInput-og8Or\u0153}-Prompt-Rse03{\u0153fieldName\u0153:\u0153instructions\u0153,\u0153id\u0153:\u0153Prompt-Rse03\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- },
- {
- "source": "Prompt-Rse03",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-Rse03\u0153}",
- "target": "OpenAIModel-gi29P",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-gi29P\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "OpenAIModel-gi29P",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "Text",
- "str"
- ],
- "dataType": "Prompt",
- "id": "Prompt-Rse03"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-Prompt-Rse03{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-Rse03\u0153}-OpenAIModel-gi29P{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-gi29P\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "selected": false
- }
- ],
- "viewport": {
- "x": -214.14726025721177,
- "y": -35.83855793844168,
- "zoom": 0.47344308394045925
+ "output_types": ["Text"],
+ "full_path": null,
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false,
+ "error": null
+ },
+ "id": "Prompt-Rse03",
+ "description": "Create a prompt template with dynamic variables.",
+ "display_name": "Prompt"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 571,
+ "dragging": false,
+ "positionAbsolute": {
+ "x": 1331.381712783371,
+ "y": 535.0279854229713
}
- },
- "description": "This flow can be used to create a blog post following instructions from the user, using two other blogs as reference.",
- "name": "Blog Writer",
- "last_tested_version": "1.0.0a0",
- "is_component": false
+ },
+ {
+ "id": "URL-HYPkR",
+ "type": "genericNode",
+ "position": {
+ "x": 568.2971412887712,
+ "y": 700.9983368007821
+ },
+ "data": {
+ "type": "URL",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Any, Dict\n\nfrom langchain_community.document_loaders.web_base import WebBaseLoader\n\nfrom langflow.custom import CustomComponent\nfrom langflow.schema import Record\n\n\nclass URLComponent(CustomComponent):\n display_name = \"URL\"\n description = \"Fetch content from one or more URLs.\"\n icon = \"layout-template\"\n\n def build_config(self) -> Dict[str, Any]:\n return {\n \"urls\": {\"display_name\": \"URL\"},\n }\n\n def build(\n self,\n urls: list[str],\n ) -> list[Record]:\n loader = WebBaseLoader(web_paths=urls)\n docs = loader.load()\n records = self.to_records(docs)\n self.status = records\n return records\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "urls": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "urls",
+ "display_name": "URL",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": [
+ "https://www.promptingguide.ai/techniques/prompt_chaining"
+ ]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Fetch content from one or more URLs.",
+ "icon": "layout-template",
+ "base_classes": ["Record"],
+ "display_name": "URL",
+ "documentation": "",
+ "custom_fields": {
+ "urls": null
+ },
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "URL-HYPkR"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 281,
+ "positionAbsolute": {
+ "x": 568.2971412887712,
+ "y": 700.9983368007821
+ },
+ "dragging": false
+ },
+ {
+ "id": "ChatOutput-JPlxl",
+ "type": "genericNode",
+ "position": {
+ "x": 2503.8617424688505,
+ "y": 789.3005578928434
+ },
+ "data": {
+ "type": "ChatOutput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "{text}",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "In case of Message being a Record, this template will be used to convert it to text.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Machine",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "AI",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Display a chat message in the Playground.",
+ "icon": "ChatOutput",
+ "base_classes": ["Text", "Record", "object", "str"],
+ "display_name": "Chat Output",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null,
+ "record_template": null
+ },
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatOutput-JPlxl"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 383
+ },
+ {
+ "id": "OpenAIModel-gi29P",
+ "type": "genericNode",
+ "position": {
+ "x": 1917.7089968570963,
+ "y": 575.9186499244129
+ },
+ "data": {
+ "type": "OpenAIModel",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Input",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n    display_name = \"OpenAI\"\n    description = \"Generates text using OpenAI LLMs.\"\n    icon = \"OpenAI\"\n\n    field_order = [\n        \"max_tokens\",\n        \"model_kwargs\",\n        \"model_name\",\n        \"openai_api_base\",\n        \"openai_api_key\",\n        \"temperature\",\n        \"input_value\",\n        \"system_message\",\n        \"stream\",\n    ]\n\n    def build_config(self):\n        return {\n            \"input_value\": {\"display_name\": \"Input\"},\n            \"max_tokens\": {\n                \"display_name\": \"Max Tokens\",\n                \"advanced\": True,\n                \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n            },\n            \"model_kwargs\": {\n                \"display_name\": \"Model Kwargs\",\n                \"advanced\": True,\n            },\n            \"model_name\": {\n                \"display_name\": \"Model Name\",\n                \"advanced\": False,\n                \"options\": MODEL_NAMES,\n            },\n            \"openai_api_base\": {\n                \"display_name\": \"OpenAI API Base\",\n                \"advanced\": True,\n                \"info\": (\n                    \"The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\\n\\n\"\n                    \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n                ),\n            },\n            \"openai_api_key\": {\n                \"display_name\": \"OpenAI API Key\",\n                \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n                \"advanced\": False,\n                \"password\": True,\n            },\n            \"temperature\": {\n                \"display_name\": \"Temperature\",\n                \"advanced\": False,\n                \"value\": 0.1,\n            },\n            \"stream\": {\n                \"display_name\": \"Stream\",\n                \"info\": STREAM_INFO_TEXT,\n                \"advanced\": True,\n            },\n            \"system_message\": {\n                \"display_name\": \"System Message\",\n                \"info\": \"System message to pass to the model.\",\n                \"advanced\": True,\n            },\n        }\n\n    def build(\n        self,\n        input_value: Text,\n        openai_api_key: str,\n        temperature: float = 0.1,\n        model_name: str = \"gpt-4o\",\n        max_tokens: Optional[int] = 256,\n        model_kwargs: NestedDict = {},\n        openai_api_base: Optional[str] = None,\n        stream: bool = False,\n        system_message: Optional[str] = None,\n    ) -> Text:\n        if not openai_api_base:\n            openai_api_base = \"https://api.openai.com/v1\"\n        if openai_api_key:\n            api_key = SecretStr(openai_api_key)\n        else:\n            api_key = None\n\n        output = ChatOpenAI(\n            max_tokens=max_tokens or None,\n            model_kwargs=model_kwargs,\n            model=model_name,\n            base_url=openai_api_base,\n            api_key=api_key,\n            temperature=temperature,\n        )\n\n        return self.get_chat_result(output, stream, input_value, system_message)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_tokens": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "1024",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_tokens",
+ "display_name": "Max Tokens",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "gpt-3.5-turbo-0125",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "gpt-4o",
+ "gpt-4-turbo",
+ "gpt-4-turbo-preview",
+ "gpt-3.5-turbo",
+ "gpt-3.5-turbo-0125"
+ ],
+ "name": "model_name",
+ "display_name": "Model Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The OpenAI API Key to use for the OpenAI model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "OPENAI_API_KEY"
+ },
+ "stream": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "stream",
+ "display_name": "Stream",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Stream the response from the model. Streaming works only in Chat.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "system_message": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "system_message",
+ "display_name": "System Message",
+ "advanced": true,
+ "dynamic": false,
+ "info": "System message to pass to the model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "temperature": {
+ "type": "float",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "0.1",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "temperature",
+ "display_name": "Temperature",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
+ },
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Generates text using OpenAI LLMs.",
+ "icon": "OpenAI",
+ "base_classes": ["str", "Text", "object"],
+ "display_name": "OpenAI",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "openai_api_key": null,
+ "temperature": null,
+ "model_name": null,
+ "max_tokens": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "stream": null,
+ "system_message": null
+ },
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "max_tokens",
+ "model_kwargs",
+ "model_name",
+ "openai_api_base",
+ "openai_api_key",
+ "temperature",
+ "input_value",
+ "system_message",
+ "stream"
+ ],
+ "beta": false
+ },
+ "id": "OpenAIModel-gi29P"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 563,
+ "positionAbsolute": {
+ "x": 1917.7089968570963,
+ "y": 575.9186499244129
+ },
+ "dragging": false
+ },
+ {
+ "id": "URL-2cX90",
+ "type": "genericNode",
+ "position": {
+ "x": 573.961301764604,
+ "y": 336.41463436122086
+ },
+ "data": {
+ "type": "URL",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Any, Dict\n\nfrom langchain_community.document_loaders.web_base import WebBaseLoader\n\nfrom langflow.custom import CustomComponent\nfrom langflow.schema import Record\n\n\nclass URLComponent(CustomComponent):\n display_name = \"URL\"\n description = \"Fetch content from one or more URLs.\"\n icon = \"layout-template\"\n\n def build_config(self) -> Dict[str, Any]:\n return {\n \"urls\": {\"display_name\": \"URL\"},\n }\n\n def build(\n self,\n urls: list[str],\n ) -> list[Record]:\n loader = WebBaseLoader(web_paths=urls)\n docs = loader.load()\n records = self.to_records(docs)\n self.status = records\n return records\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "urls": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "urls",
+ "display_name": "URL",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": ["https://www.promptingguide.ai/introduction/basics"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Fetch content from one or more URLs.",
+ "icon": "layout-template",
+ "base_classes": ["Record"],
+ "display_name": "URL",
+ "documentation": "",
+ "custom_fields": {
+ "urls": null
+ },
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "URL-2cX90"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 281,
+ "positionAbsolute": {
+ "x": 573.961301764604,
+ "y": 336.41463436122086
+ },
+ "dragging": false
+ },
+ {
+ "id": "TextInput-og8Or",
+ "type": "genericNode",
+ "position": {
+ "x": 569.9387927203336,
+ "y": 1095.3352160671316
+ },
+ "data": {
+ "type": "TextInput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextInput(TextComponent):\n display_name = \"Text Input\"\n description = \"Get text inputs from the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as input.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Optional[str] = \"\",\n record_template: Optional[str] = \"\",\n ) -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "Use the references above for style to write a new blog/tutorial about prompt engineering techniques. Suggest non-covered topics.",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Value",
+ "advanced": false,
+ "input_types": ["Record", "Text"],
+ "dynamic": false,
+ "info": "Text or Record to be passed as input.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Get text inputs from the Playground.",
+ "icon": "type",
+ "base_classes": ["object", "Text", "str"],
+ "display_name": "Instructions",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "record_template": null
+ },
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "TextInput-og8Or"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 289,
+ "positionAbsolute": {
+ "x": 569.9387927203336,
+ "y": 1095.3352160671316
+ },
+ "dragging": false
+ }
+ ],
+ "edges": [
+ {
+ "source": "URL-HYPkR",
+ "target": "Prompt-Rse03",
+ "sourceHandle": "{œbaseClassesœ:[œRecordœ],œdataTypeœ:œURLœ,œidœ:œURL-HYPkRœ}",
+ "targetHandle": "{œfieldNameœ:œreference_2œ,œidœ:œPrompt-Rse03œ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "id": "reactflow__edge-URL-HYPkR{œbaseClassesœ:[œRecordœ],œdataTypeœ:œURLœ,œidœ:œURL-HYPkRœ}-Prompt-Rse03{œfieldNameœ:œreference_2œ,œidœ:œPrompt-Rse03œ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "reference_2",
+ "id": "Prompt-Rse03",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Record"],
+ "dataType": "URL",
+ "id": "URL-HYPkR"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "selected": false
+ },
+ {
+ "source": "OpenAIModel-gi29P",
+ "sourceHandle": "{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-gi29Pœ}",
+ "target": "ChatOutput-JPlxl",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-JPlxlœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "ChatOutput-JPlxl",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["str", "Text", "object"],
+ "dataType": "OpenAIModel",
+ "id": "OpenAIModel-gi29P"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-OpenAIModel-gi29P{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-gi29Pœ}-ChatOutput-JPlxl{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-JPlxlœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "URL-2cX90",
+ "sourceHandle": "{œbaseClassesœ:[œRecordœ],œdataTypeœ:œURLœ,œidœ:œURL-2cX90œ}",
+ "target": "Prompt-Rse03",
+ "targetHandle": "{œfieldNameœ:œreference_1œ,œidœ:œPrompt-Rse03œ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "reference_1",
+ "id": "Prompt-Rse03",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Record"],
+ "dataType": "URL",
+ "id": "URL-2cX90"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-URL-2cX90{œbaseClassesœ:[œRecordœ],œdataTypeœ:œURLœ,œidœ:œURL-2cX90œ}-Prompt-Rse03{œfieldNameœ:œreference_1œ,œidœ:œPrompt-Rse03œ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "TextInput-og8Or",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œTextInputœ,œidœ:œTextInput-og8Orœ}",
+ "target": "Prompt-Rse03",
+ "targetHandle": "{œfieldNameœ:œinstructionsœ,œidœ:œPrompt-Rse03œ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "instructions",
+ "id": "Prompt-Rse03",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "Text", "str"],
+ "dataType": "TextInput",
+ "id": "TextInput-og8Or"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-TextInput-og8Or{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œTextInputœ,œidœ:œTextInput-og8Orœ}-Prompt-Rse03{œfieldNameœ:œinstructionsœ,œidœ:œPrompt-Rse03œ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "Prompt-Rse03",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-Rse03œ}",
+ "target": "OpenAIModel-gi29P",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-gi29Pœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "OpenAIModel-gi29P",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "Text", "str"],
+ "dataType": "Prompt",
+ "id": "Prompt-Rse03"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-Prompt-Rse03{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-Rse03œ}-OpenAIModel-gi29P{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-gi29Pœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "selected": false
+ }
+ ],
+ "viewport": {
+ "x": -214.14726025721177,
+ "y": -35.83855793844168,
+ "zoom": 0.47344308394045925
+ }
+ },
+ "description": "This flow can be used to create a blog post following instructions from the user, using two other blogs as reference.",
+ "name": "Blog Writer",
+ "last_tested_version": "1.0.0a0",
+ "is_component": false
}
diff --git a/src/backend/base/langflow/initial_setup/starter_projects/Langflow Document QA.json b/src/backend/base/langflow/initial_setup/starter_projects/Langflow Document QA.json
index b7228b9e7..5fa8547a1 100644
--- a/src/backend/base/langflow/initial_setup/starter_projects/Langflow Document QA.json
+++ b/src/backend/base/langflow/initial_setup/starter_projects/Langflow Document QA.json
@@ -1,1029 +1,933 @@
{
- "id": "fecbce42-6f11-454c-8ab2-db6eddbbbb0f",
- "data": {
- "nodes": [
- {
- "id": "Prompt-tHwPf",
- "type": "genericNode",
- "position": {
- "x": 585.7906101139403,
- "y": 117.52115876762832
- },
- "data": {
- "type": "Prompt",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "template": {
- "type": "prompt",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "Answer user's questions based on the document below:\n\n---\n\n{Document}\n\n---\n\nQuestion:\n{Question}\n\nAnswer:\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "template",
- "display_name": "Template",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent",
- "Document": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "Document",
- "display_name": "Document",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- },
- "Question": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "Question",
- "display_name": "Question",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- }
- },
- "description": "Create a prompt template with dynamic variables.",
- "icon": "prompts",
- "is_input": null,
- "is_output": null,
- "is_composition": null,
- "base_classes": [
- "object",
- "str",
- "Text"
- ],
- "name": "",
- "display_name": "Prompt",
- "documentation": "",
- "custom_fields": {
- "template": [
- "Document",
- "Question"
- ]
- },
- "output_types": [
- "Text"
- ],
- "full_path": null,
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false,
- "error": null
- },
- "id": "Prompt-tHwPf",
- "description": "A component for creating prompt templates using dynamic variables.",
- "display_name": "Prompt"
- },
- "selected": false,
- "width": 384,
- "height": 479,
- "positionAbsolute": {
- "x": 585.7906101139403,
- "y": 117.52115876762832
- },
- "dragging": false
+ "id": "fecbce42-6f11-454c-8ab2-db6eddbbbb0f",
+ "data": {
+ "nodes": [
+ {
+ "id": "Prompt-tHwPf",
+ "type": "genericNode",
+ "position": {
+ "x": 585.7906101139403,
+ "y": 117.52115876762832
+ },
+ "data": {
+ "type": "Prompt",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "template": {
+ "type": "prompt",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "Answer user's questions based on the document below:\n\n---\n\n{Document}\n\n---\n\nQuestion:\n{Question}\n\nAnswer:\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "template",
+ "display_name": "Template",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent",
+ "Document": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "Document",
+ "display_name": "Document",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ },
+ "Question": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "Question",
+ "display_name": "Question",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ }
},
- {
- "id": "File-6TEsD",
- "type": "genericNode",
- "position": {
- "x": -18.636536329280602,
- "y": 3.951948774836353
- },
- "data": {
- "type": "File",
- "node": {
- "template": {
- "path": {
- "type": "file",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [
- ".txt",
- ".md",
- ".mdx",
- ".csv",
- ".json",
- ".yaml",
- ".yml",
- ".xml",
- ".html",
- ".htm",
- ".pdf",
- ".docx"
- ],
- "password": false,
- "name": "path",
- "display_name": "Path",
- "advanced": false,
- "dynamic": false,
- "info": "Supported file types: txt, md, mdx, csv, json, yaml, yml, xml, html, htm, pdf, docx",
- "load_from_db": false,
- "title_case": false,
- "value": ""
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from pathlib import Path\nfrom typing import Any, Dict\n\nfrom langflow.base.data.utils import TEXT_FILE_TYPES, parse_text_file_to_record\nfrom langflow.interface.custom.custom_component import CustomComponent\nfrom langflow.schema import Record\n\n\nclass FileComponent(CustomComponent):\n display_name = \"Files\"\n description = \"A generic file loader.\"\n\n def build_config(self) -> Dict[str, Any]:\n return {\n \"path\": {\n \"display_name\": \"Path\",\n \"field_type\": \"file\",\n \"file_types\": TEXT_FILE_TYPES,\n \"info\": f\"Supported file types: {', '.join(TEXT_FILE_TYPES)}\",\n },\n \"silent_errors\": {\n \"display_name\": \"Silent Errors\",\n \"advanced\": True,\n \"info\": \"If true, errors will not raise an exception.\",\n },\n }\n\n def load_file(self, path: str, silent_errors: bool = False) -> Record:\n resolved_path = self.resolve_path(path)\n path_obj = Path(resolved_path)\n extension = path_obj.suffix[1:].lower()\n if extension == \"doc\":\n raise ValueError(\"doc files are not supported. Please save as .docx\")\n if extension not in TEXT_FILE_TYPES:\n raise ValueError(f\"Unsupported file type: {extension}\")\n record = parse_text_file_to_record(resolved_path, silent_errors)\n self.status = record if record else \"No data\"\n return record or Record()\n\n def build(\n self,\n path: str,\n silent_errors: bool = False,\n ) -> Record:\n record = self.load_file(path, silent_errors)\n self.status = record\n return record\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "silent_errors": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "silent_errors",
- "display_name": "Silent Errors",
- "advanced": true,
- "dynamic": false,
- "info": "If true, errors will not raise an exception.",
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent"
- },
- "description": "A generic file loader.",
- "base_classes": [
- "Record"
- ],
- "display_name": "Files",
- "documentation": "",
- "custom_fields": {
- "path": null,
- "silent_errors": null
- },
- "output_types": [
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "File-6TEsD"
- },
- "selected": false,
- "width": 384,
- "height": 282,
- "positionAbsolute": {
- "x": -18.636536329280602,
- "y": 3.951948774836353
- },
- "dragging": false
+ "description": "Create a prompt template with dynamic variables.",
+ "icon": "prompts",
+ "is_input": null,
+ "is_output": null,
+ "is_composition": null,
+ "base_classes": ["object", "str", "Text"],
+ "name": "",
+ "display_name": "Prompt",
+ "documentation": "",
+ "custom_fields": {
+ "template": ["Document", "Question"]
},
- {
- "id": "ChatInput-MsSJ9",
- "type": "genericNode",
- "position": {
- "x": -28.80036300619821,
- "y": 379.81180230285355
- },
- "data": {
- "type": "ChatInput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n\n def build_config(self):\n build_config = super().build_config()\n build_config[\"input_value\"] = {\n \"input_types\": [],\n \"display_name\": \"Message\",\n \"multiline\": True,\n }\n\n return build_config\n\n def build(\n self,\n sender: Optional[str] = \"User\",\n sender_name: Optional[str] = \"User\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n ) -> Union[Text, Record]:\n return super().build_no_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "value": ""
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "User",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "User",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": true,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Get chat inputs from the Playground.",
- "icon": "ChatInput",
- "base_classes": [
- "str",
- "Record",
- "Text",
- "object"
- ],
- "display_name": "Chat Input",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatInput-MsSJ9"
- },
- "selected": true,
- "width": 384,
- "height": 377,
- "positionAbsolute": {
- "x": -28.80036300619821,
- "y": 379.81180230285355
- },
- "dragging": false
+ "output_types": ["Text"],
+ "full_path": null,
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false,
+ "error": null
+ },
+ "id": "Prompt-tHwPf",
+ "description": "A component for creating prompt templates using dynamic variables.",
+ "display_name": "Prompt"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 479,
+ "positionAbsolute": {
+ "x": 585.7906101139403,
+ "y": 117.52115876762832
+ },
+ "dragging": false
+ },
+ {
+ "id": "File-6TEsD",
+ "type": "genericNode",
+ "position": {
+ "x": -18.636536329280602,
+ "y": 3.951948774836353
+ },
+ "data": {
+ "type": "File",
+ "node": {
+ "template": {
+ "path": {
+ "type": "file",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [
+ ".txt",
+ ".md",
+ ".mdx",
+ ".csv",
+ ".json",
+ ".yaml",
+ ".yml",
+ ".xml",
+ ".html",
+ ".htm",
+ ".pdf",
+ ".docx"
+ ],
+ "password": false,
+ "name": "path",
+ "display_name": "Path",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Supported file types: txt, md, mdx, csv, json, yaml, yml, xml, html, htm, pdf, docx",
+ "load_from_db": false,
+ "title_case": false,
+ "value": ""
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from pathlib import Path\nfrom typing import Any, Dict\n\nfrom langflow.base.data.utils import TEXT_FILE_TYPES, parse_text_file_to_record\nfrom langflow.interface.custom.custom_component import CustomComponent\nfrom langflow.schema import Record\n\n\nclass FileComponent(CustomComponent):\n display_name = \"Files\"\n description = \"A generic file loader.\"\n\n def build_config(self) -> Dict[str, Any]:\n return {\n \"path\": {\n \"display_name\": \"Path\",\n \"field_type\": \"file\",\n \"file_types\": TEXT_FILE_TYPES,\n \"info\": f\"Supported file types: {', '.join(TEXT_FILE_TYPES)}\",\n },\n \"silent_errors\": {\n \"display_name\": \"Silent Errors\",\n \"advanced\": True,\n \"info\": \"If true, errors will not raise an exception.\",\n },\n }\n\n def load_file(self, path: str, silent_errors: bool = False) -> Record:\n resolved_path = self.resolve_path(path)\n path_obj = Path(resolved_path)\n extension = path_obj.suffix[1:].lower()\n if extension == \"doc\":\n raise ValueError(\"doc files are not supported. Please save as .docx\")\n if extension not in TEXT_FILE_TYPES:\n raise ValueError(f\"Unsupported file type: {extension}\")\n record = parse_text_file_to_record(resolved_path, silent_errors)\n self.status = record if record else \"No data\"\n return record or Record()\n\n def build(\n self,\n path: str,\n silent_errors: bool = False,\n ) -> Record:\n record = self.load_file(path, silent_errors)\n self.status = record\n return record\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "silent_errors": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "silent_errors",
+ "display_name": "Silent Errors",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If true, errors will not raise an exception.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "ChatOutput-F5Awj",
- "type": "genericNode",
- "position": {
- "x": 1733.3012915204283,
- "y": 168.76098809939327
- },
- "data": {
- "type": "ChatOutput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Machine",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "AI",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": true,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a chat message in the Playground.",
- "icon": "ChatOutput",
- "base_classes": [
- "str",
- "Record",
- "Text",
- "object"
- ],
- "display_name": "Chat Output",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatOutput-F5Awj"
- },
- "selected": false,
- "width": 384,
- "height": 385,
- "positionAbsolute": {
- "x": 1733.3012915204283,
- "y": 168.76098809939327
- },
- "dragging": false
+ "description": "A generic file loader.",
+ "base_classes": ["Record"],
+ "display_name": "Files",
+ "documentation": "",
+ "custom_fields": {
+ "path": null,
+ "silent_errors": null
},
- {
- "id": "OpenAIModel-Bt067",
- "type": "genericNode",
- "position": {
- "x": 1137.6078582863759,
- "y": -14.41920034020356
- },
- "data": {
- "type": "OpenAIModel",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Input",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": MODEL_NAMES,\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float,\n model_name: str = \"gpt-4o\",\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n output = ChatOpenAI(\n max_tokens=max_tokens or None,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_tokens": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 256,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_tokens",
- "display_name": "Max Tokens",
- "advanced": true,
- "dynamic": false,
- "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
- "load_from_db": false,
- "title_case": false
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "gpt-4-turbo-preview",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "gpt-4o",
- "gpt-4-turbo",
- "gpt-4-turbo-preview",
- "gpt-3.5-turbo",
- "gpt-3.5-turbo-0125"
- ],
- "name": "model_name",
- "display_name": "Model Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "The OpenAI API Key to use for the OpenAI model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "OPENAI_API_KEY"
- },
- "stream": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "stream",
- "display_name": "Stream",
- "advanced": false,
- "dynamic": false,
- "info": "Stream the response from the model. Streaming works only in Chat.",
- "load_from_db": false,
- "title_case": false
- },
- "system_message": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "system_message",
- "display_name": "System Message",
- "advanced": true,
- "dynamic": false,
- "info": "System message to pass to the model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "temperature": {
- "type": "float",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 0.1,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "temperature",
- "display_name": "Temperature",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent"
- },
- "description": "Generates text using OpenAI LLMs.",
- "icon": "OpenAI",
- "base_classes": [
- "object",
- "str",
- "Text"
- ],
- "display_name": "OpenAI",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "openai_api_key": null,
- "temperature": null,
- "model_name": null,
- "max_tokens": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "stream": null,
- "system_message": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "max_tokens",
- "model_kwargs",
- "model_name",
- "openai_api_base",
- "openai_api_key",
- "temperature",
- "input_value",
- "system_message",
- "stream"
- ],
- "beta": false
- },
- "id": "OpenAIModel-Bt067"
- },
- "selected": false,
- "width": 384,
- "height": 642,
- "positionAbsolute": {
- "x": 1137.6078582863759,
- "y": -14.41920034020356
- },
- "dragging": false
- }
- ],
- "edges": [
- {
- "source": "ChatInput-MsSJ9",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Record\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-MsSJ9\u0153}",
- "target": "Prompt-tHwPf",
- "targetHandle": "{\u0153fieldName\u0153:\u0153Question\u0153,\u0153id\u0153:\u0153Prompt-tHwPf\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "Question",
- "id": "Prompt-tHwPf",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "str",
- "Record",
- "Text",
- "object"
- ],
- "dataType": "ChatInput",
- "id": "ChatInput-MsSJ9"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-ChatInput-MsSJ9{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Record\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-MsSJ9\u0153}-Prompt-tHwPf{\u0153fieldName\u0153:\u0153Question\u0153,\u0153id\u0153:\u0153Prompt-tHwPf\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "File-6TEsD"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 282,
+ "positionAbsolute": {
+ "x": -18.636536329280602,
+ "y": 3.951948774836353
+ },
+ "dragging": false
+ },
+ {
+ "id": "ChatInput-MsSJ9",
+ "type": "genericNode",
+ "position": {
+ "x": -28.80036300619821,
+ "y": 379.81180230285355
+ },
+ "data": {
+ "type": "ChatInput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n\n def build_config(self):\n build_config = super().build_config()\n build_config[\"input_value\"] = {\n \"input_types\": [],\n \"display_name\": \"Message\",\n \"multiline\": True,\n }\n\n return build_config\n\n def build(\n self,\n sender: Optional[str] = \"User\",\n sender_name: Optional[str] = \"User\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n ) -> Union[Text, Record]:\n return super().build_no_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": [],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "value": ""
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "User",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "User",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "File-6TEsD",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153File\u0153,\u0153id\u0153:\u0153File-6TEsD\u0153}",
- "target": "Prompt-tHwPf",
- "targetHandle": "{\u0153fieldName\u0153:\u0153Document\u0153,\u0153id\u0153:\u0153Prompt-tHwPf\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "Document",
- "id": "Prompt-tHwPf",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "Record"
- ],
- "dataType": "File",
- "id": "File-6TEsD"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-File-6TEsD{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153File\u0153,\u0153id\u0153:\u0153File-6TEsD\u0153}-Prompt-tHwPf{\u0153fieldName\u0153:\u0153Document\u0153,\u0153id\u0153:\u0153Prompt-tHwPf\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "description": "Get chat inputs from the Playground.",
+ "icon": "ChatInput",
+ "base_classes": ["str", "Record", "Text", "object"],
+ "display_name": "Chat Input",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null
},
- {
- "source": "Prompt-tHwPf",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-tHwPf\u0153}",
- "target": "OpenAIModel-Bt067",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-Bt067\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "OpenAIModel-Bt067",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "str",
- "Text"
- ],
- "dataType": "Prompt",
- "id": "Prompt-tHwPf"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-Prompt-tHwPf{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-tHwPf\u0153}-OpenAIModel-Bt067{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-Bt067\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatInput-MsSJ9"
+ },
+ "selected": true,
+ "width": 384,
+ "height": 377,
+ "positionAbsolute": {
+ "x": -28.80036300619821,
+ "y": 379.81180230285355
+ },
+ "dragging": false
+ },
+ {
+ "id": "ChatOutput-F5Awj",
+ "type": "genericNode",
+ "position": {
+ "x": 1733.3012915204283,
+ "y": 168.76098809939327
+ },
+ "data": {
+ "type": "ChatOutput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Machine",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "AI",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "OpenAIModel-Bt067",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-Bt067\u0153}",
- "target": "ChatOutput-F5Awj",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-F5Awj\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "ChatOutput-F5Awj",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "str",
- "Text"
- ],
- "dataType": "OpenAIModel",
- "id": "OpenAIModel-Bt067"
- }
+ "description": "Display a chat message in the Playground.",
+ "icon": "ChatOutput",
+ "base_classes": ["str", "Record", "Text", "object"],
+ "display_name": "Chat Output",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null
+ },
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatOutput-F5Awj"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 385,
+ "positionAbsolute": {
+ "x": 1733.3012915204283,
+ "y": 168.76098809939327
+ },
+ "dragging": false
+ },
+ {
+ "id": "OpenAIModel-Bt067",
+ "type": "genericNode",
+ "position": {
+ "x": 1137.6078582863759,
+ "y": -14.41920034020356
+ },
+ "data": {
+ "type": "OpenAIModel",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Input",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": MODEL_NAMES,\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float = 0.1,\n model_name: str = \"gpt-4o\",\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n output = ChatOpenAI(\n max_tokens=max_tokens or None,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_tokens": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 256,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_tokens",
+ "display_name": "Max Tokens",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "gpt-4-turbo-preview",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "gpt-4o",
+ "gpt-4-turbo",
+ "gpt-4-turbo-preview",
+ "gpt-3.5-turbo",
+ "gpt-3.5-turbo-0125"
+ ],
+ "name": "model_name",
+ "display_name": "Model Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The OpenAI API Key to use for the OpenAI model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "OPENAI_API_KEY"
+ },
+ "stream": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "stream",
+ "display_name": "Stream",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Stream the response from the model. Streaming works only in Chat.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "system_message": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "system_message",
+ "display_name": "System Message",
+ "advanced": true,
+ "dynamic": false,
+ "info": "System message to pass to the model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "temperature": {
+ "type": "float",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 0.1,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "temperature",
+ "display_name": "Temperature",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
},
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-OpenAIModel-Bt067{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-Bt067\u0153}-ChatOutput-F5Awj{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-F5Awj\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- }
- ],
- "viewport": {
- "x": 352.20899206064655,
- "y": 56.054900898593075,
- "zoom": 0.9023391400011
- }
- },
- "description": "This flow integrates PDF reading with a language model to answer document-specific questions. Ideal for small-scale texts, it facilitates direct queries with immediate insights.",
- "name": "Document QA",
- "last_tested_version": "1.0.0a0",
- "is_component": false
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Generates text using OpenAI LLMs.",
+ "icon": "OpenAI",
+ "base_classes": ["object", "str", "Text"],
+ "display_name": "OpenAI",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "openai_api_key": null,
+ "temperature": null,
+ "model_name": null,
+ "max_tokens": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "stream": null,
+ "system_message": null
+ },
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "max_tokens",
+ "model_kwargs",
+ "model_name",
+ "openai_api_base",
+ "openai_api_key",
+ "temperature",
+ "input_value",
+ "system_message",
+ "stream"
+ ],
+ "beta": false
+ },
+ "id": "OpenAIModel-Bt067"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 642,
+ "positionAbsolute": {
+ "x": 1137.6078582863759,
+ "y": -14.41920034020356
+ },
+ "dragging": false
+ }
+ ],
+ "edges": [
+ {
+ "source": "ChatInput-MsSJ9",
+ "sourceHandle": "{œbaseClassesœ:[œstrœ,œRecordœ,œTextœ,œobjectœ],œdataTypeœ:œChatInputœ,œidœ:œChatInput-MsSJ9œ}",
+ "target": "Prompt-tHwPf",
+ "targetHandle": "{œfieldNameœ:œQuestionœ,œidœ:œPrompt-tHwPfœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "Question",
+ "id": "Prompt-tHwPf",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["str", "Record", "Text", "object"],
+ "dataType": "ChatInput",
+ "id": "ChatInput-MsSJ9"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-ChatInput-MsSJ9{œbaseClassesœ:[œstrœ,œRecordœ,œTextœ,œobjectœ],œdataTypeœ:œChatInputœ,œidœ:œChatInput-MsSJ9œ}-Prompt-tHwPf{œfieldNameœ:œQuestionœ,œidœ:œPrompt-tHwPfœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "File-6TEsD",
+ "sourceHandle": "{œbaseClassesœ:[œRecordœ],œdataTypeœ:œFileœ,œidœ:œFile-6TEsDœ}",
+ "target": "Prompt-tHwPf",
+ "targetHandle": "{œfieldNameœ:œDocumentœ,œidœ:œPrompt-tHwPfœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "Document",
+ "id": "Prompt-tHwPf",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Record"],
+ "dataType": "File",
+ "id": "File-6TEsD"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-File-6TEsD{œbaseClassesœ:[œRecordœ],œdataTypeœ:œFileœ,œidœ:œFile-6TEsDœ}-Prompt-tHwPf{œfieldNameœ:œDocumentœ,œidœ:œPrompt-tHwPfœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "Prompt-tHwPf",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-tHwPfœ}",
+ "target": "OpenAIModel-Bt067",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-Bt067œ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "OpenAIModel-Bt067",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "str", "Text"],
+ "dataType": "Prompt",
+ "id": "Prompt-tHwPf"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-Prompt-tHwPf{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-tHwPfœ}-OpenAIModel-Bt067{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-Bt067œ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "OpenAIModel-Bt067",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-Bt067œ}",
+ "target": "ChatOutput-F5Awj",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-F5Awjœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "ChatOutput-F5Awj",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "str", "Text"],
+ "dataType": "OpenAIModel",
+ "id": "OpenAIModel-Bt067"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-OpenAIModel-Bt067{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-Bt067œ}-ChatOutput-F5Awj{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-F5Awjœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ }
+ ],
+ "viewport": {
+ "x": 352.20899206064655,
+ "y": 56.054900898593075,
+ "zoom": 0.9023391400011
+ }
+ },
+ "description": "This flow integrates PDF reading with a language model to answer document-specific questions. Ideal for small-scale texts, it facilitates direct queries with immediate insights.",
+ "name": "Document QA",
+ "last_tested_version": "1.0.0a0",
+ "is_component": false
}
diff --git a/src/backend/base/langflow/initial_setup/starter_projects/Langflow Memory Conversation.json b/src/backend/base/langflow/initial_setup/starter_projects/Langflow Memory Conversation.json
index 235af8f6f..1002b721a 100644
--- a/src/backend/base/langflow/initial_setup/starter_projects/Langflow Memory Conversation.json
+++ b/src/backend/base/langflow/initial_setup/starter_projects/Langflow Memory Conversation.json
@@ -1,1272 +1,1137 @@
{
- "id": "08d5cccf-d098-4367-b14b-1078429c9ed9",
- "icon": "\ud83e\udd16",
- "icon_bg_color": "#FFD700",
- "data": {
- "nodes": [
- {
- "id": "ChatInput-t7F8v",
- "type": "genericNode",
- "position": {
- "x": 1283.2700598313072,
- "y": 982.5953650473145
- },
- "data": {
- "type": "ChatInput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n\n def build_config(self):\n build_config = super().build_config()\n build_config[\"input_value\"] = {\n \"input_types\": [],\n \"display_name\": \"Message\",\n \"multiline\": True,\n }\n\n return build_config\n\n def build(\n self,\n sender: Optional[str] = \"User\",\n sender_name: Optional[str] = \"User\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n ) -> Union[Text, Record]:\n return super().build_no_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "value": ""
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "User",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "User",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": false,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "MySessionID"
- },
- "_type": "CustomComponent"
- },
- "description": "Get chat inputs from the Playground.",
- "icon": "ChatInput",
- "base_classes": [
- "Text",
- "object",
- "Record",
- "str"
- ],
- "display_name": "Chat Input",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatInput-t7F8v"
- },
- "selected": false,
- "width": 384,
- "height": 469,
- "positionAbsolute": {
- "x": 1283.2700598313072,
- "y": 982.5953650473145
- },
- "dragging": false
+ "id": "08d5cccf-d098-4367-b14b-1078429c9ed9",
+ "icon": "🤖",
+ "icon_bg_color": "#FFD700",
+ "data": {
+ "nodes": [
+ {
+ "id": "ChatInput-t7F8v",
+ "type": "genericNode",
+ "position": {
+ "x": 1283.2700598313072,
+ "y": 982.5953650473145
+ },
+ "data": {
+ "type": "ChatInput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n\n def build_config(self):\n build_config = super().build_config()\n build_config[\"input_value\"] = {\n \"input_types\": [],\n \"display_name\": \"Message\",\n \"multiline\": True,\n }\n\n return build_config\n\n def build(\n self,\n sender: Optional[str] = \"User\",\n sender_name: Optional[str] = \"User\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n ) -> Union[Text, Record]:\n return super().build_no_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": [],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "value": ""
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "User",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "User",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": false,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "MySessionID"
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "ChatOutput-P1jEe",
- "type": "genericNode",
- "position": {
- "x": 3154.916355514023,
- "y": 851.051882666333
- },
- "data": {
- "type": "ChatOutput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Machine",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "AI",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": false,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "MySessionID"
- },
- "_type": "CustomComponent"
- },
- "description": "Display a chat message in the Playground.",
- "icon": "ChatOutput",
- "base_classes": [
- "Text",
- "object",
- "Record",
- "str"
- ],
- "display_name": "Chat Output",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatOutput-P1jEe"
- },
- "selected": false,
- "width": 384,
- "height": 477,
- "dragging": false,
- "positionAbsolute": {
- "x": 3154.916355514023,
- "y": 851.051882666333
- }
+ "description": "Get chat inputs from the Playground.",
+ "icon": "ChatInput",
+ "base_classes": ["Text", "object", "Record", "str"],
+ "display_name": "Chat Input",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null
},
- {
- "id": "MemoryComponent-cdA1J",
- "type": "genericNode",
- "position": {
- "x": 1289.9606870058817,
- "y": 442.16804561053766
- },
- "data": {
- "type": "MemoryComponent",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
-                                "value": "from typing import Optional\n\nfrom langflow.base.memory.memory import BaseMemoryComponent\nfrom langflow.field_typing import Text\nfrom langflow.helpers.record import records_to_text\nfrom langflow.memory import get_messages\nfrom langflow.schema.schema import Record\n\n\nclass MemoryComponent(BaseMemoryComponent):\n    display_name = \"Chat Memory\"\n    description = \"Retrieves stored chat messages given a specific Session ID.\"\n    beta: bool = True\n    icon = \"history\"\n\n    def build_config(self):\n        return {\n            \"sender\": {\n                \"options\": [\"Machine\", \"User\", \"Machine and User\"],\n                \"display_name\": \"Sender Type\",\n            },\n            \"sender_name\": {\"display_name\": \"Sender Name\", \"advanced\": True},\n            \"n_messages\": {\n                \"display_name\": \"Number of Messages\",\n                \"info\": \"Number of messages to retrieve.\",\n            },\n            \"session_id\": {\n                \"display_name\": \"Session ID\",\n                \"info\": \"Session ID of the chat history.\",\n                \"input_types\": [\"Text\"],\n            },\n            \"order\": {\n                \"options\": [\"Ascending\", \"Descending\"],\n                \"display_name\": \"Order\",\n                \"info\": \"Order of the messages.\",\n                \"advanced\": True,\n            },\n            \"record_template\": {\n                \"display_name\": \"Record Template\",\n                \"multiline\": True,\n                \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n                \"advanced\": True,\n            },\n        }\n\n    def get_messages(self, **kwargs) -> list[Record]:\n        # Validate kwargs by checking if it contains the correct keys\n        if \"sender\" not in kwargs:\n            kwargs[\"sender\"] = None\n        if \"sender_name\" not in kwargs:\n            kwargs[\"sender_name\"] = None\n        if \"session_id\" not in kwargs:\n            kwargs[\"session_id\"] = None\n        if \"limit\" not in kwargs:\n            kwargs[\"limit\"] = 5\n        if \"order\" not in kwargs:\n            kwargs[\"order\"] = \"Descending\"\n\n        kwargs[\"order\"] = \"DESC\" if kwargs[\"order\"] == \"Descending\" else \"ASC\"\n        if kwargs[\"sender\"] == \"Machine and User\":\n            kwargs[\"sender\"] = None\n        return get_messages(**kwargs)\n\n    def build(\n        self,\n        sender: Optional[str] = \"Machine and User\",\n        sender_name: Optional[str] = None,\n        session_id: Optional[str] = None,\n        n_messages: int = 5,\n        order: Optional[str] = \"Descending\",\n        record_template: Optional[str] = \"{sender_name}: {text}\",\n    ) -> Text:\n        messages = self.get_messages(\n            sender=sender,\n            sender_name=sender_name,\n            session_id=session_id,\n            limit=n_messages,\n            order=order,\n        )\n        messages_str = records_to_text(template=record_template or \"\", records=messages)\n        self.status = messages_str\n        return messages_str\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "n_messages": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 5,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "n_messages",
- "display_name": "Number of Messages",
- "advanced": false,
- "dynamic": false,
- "info": "Number of messages to retrieve.",
- "load_from_db": false,
- "title_case": false
- },
- "order": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Descending",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Ascending",
- "Descending"
- ],
- "name": "order",
- "display_name": "Order",
- "advanced": true,
- "dynamic": false,
- "info": "Order of the messages.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "{sender_name}: {text}",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Machine and User",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User",
- "Machine and User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "Session ID of the chat history.",
- "load_from_db": false,
- "title_case": false,
- "value": "MySessionID"
- },
- "_type": "CustomComponent"
- },
- "description": "Retrieves stored chat messages given a specific Session ID.",
- "icon": "history",
- "base_classes": [
- "str",
- "Text",
- "object"
- ],
- "display_name": "Chat Memory",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "session_id": null,
- "n_messages": null,
- "order": null,
- "record_template": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": true
- },
- "id": "MemoryComponent-cdA1J",
- "description": "Retrieves stored chat messages given a specific Session ID.",
- "display_name": "Chat Memory"
- },
- "selected": false,
- "width": 384,
- "height": 489,
- "dragging": false,
- "positionAbsolute": {
- "x": 1289.9606870058817,
- "y": 442.16804561053766
- }
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatInput-t7F8v"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 469,
+ "positionAbsolute": {
+ "x": 1283.2700598313072,
+ "y": 982.5953650473145
+ },
+ "dragging": false
+ },
+ {
+ "id": "ChatOutput-P1jEe",
+ "type": "genericNode",
+ "position": {
+ "x": 3154.916355514023,
+ "y": 851.051882666333
+ },
+ "data": {
+ "type": "ChatOutput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Machine",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "AI",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": false,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "MySessionID"
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "Prompt-ODkUx",
- "type": "genericNode",
- "position": {
- "x": 1894.594426342426,
- "y": 753.3797365481901
- },
- "data": {
- "type": "Prompt",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "template": {
- "type": "prompt",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "{context}\n\nUser: {user_message}\nAI: ",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "template",
- "display_name": "Template",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent",
- "context": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "context",
- "display_name": "context",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- },
- "user_message": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "user_message",
- "display_name": "user_message",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- }
- },
- "description": "Create a prompt template with dynamic variables.",
- "icon": "prompts",
- "is_input": null,
- "is_output": null,
- "is_composition": null,
- "base_classes": [
- "Text",
- "str",
- "object"
- ],
- "name": "",
- "display_name": "Prompt",
- "documentation": "",
- "custom_fields": {
- "template": [
- "context",
- "user_message"
- ]
- },
- "output_types": [
- "Text"
- ],
- "full_path": null,
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false,
- "error": null
- },
- "id": "Prompt-ODkUx",
- "description": "A component for creating prompt templates using dynamic variables.",
- "display_name": "Prompt"
- },
- "selected": false,
- "width": 384,
- "height": 477,
- "dragging": false,
- "positionAbsolute": {
- "x": 1894.594426342426,
- "y": 753.3797365481901
- }
+ "description": "Display a chat message in the Playground.",
+ "icon": "ChatOutput",
+ "base_classes": ["Text", "object", "Record", "str"],
+ "display_name": "Chat Output",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null
},
- {
- "id": "OpenAIModel-9RykF",
- "type": "genericNode",
- "position": {
- "x": 2561.5850334731617,
- "y": 553.2745131130916
- },
- "data": {
- "type": "OpenAIModel",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Input",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": MODEL_NAMES,\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float,\n model_name: str = \"gpt-4o\",\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n output = ChatOpenAI(\n max_tokens=max_tokens or None,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_tokens": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 256,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_tokens",
- "display_name": "Max Tokens",
- "advanced": true,
- "dynamic": false,
- "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
- "load_from_db": false,
- "title_case": false
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "gpt-4-1106-preview",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "gpt-4o",
- "gpt-4-turbo",
- "gpt-4-turbo-preview",
- "gpt-3.5-turbo",
- "gpt-3.5-turbo-0125"
- ],
- "name": "model_name",
- "display_name": "Model Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "The OpenAI API Key to use for the OpenAI model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "OPENAI_API_KEY"
- },
- "stream": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "stream",
- "display_name": "Stream",
- "advanced": true,
- "dynamic": false,
- "info": "Stream the response from the model. Streaming works only in Chat.",
- "load_from_db": false,
- "title_case": false
- },
- "system_message": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "system_message",
- "display_name": "System Message",
- "advanced": true,
- "dynamic": false,
- "info": "System message to pass to the model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "temperature": {
- "type": "float",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "0.2",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "temperature",
- "display_name": "Temperature",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent"
- },
- "description": "Generates text using OpenAI LLMs.",
- "icon": "OpenAI",
- "base_classes": [
- "str",
- "object",
- "Text"
- ],
- "display_name": "OpenAI",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "openai_api_key": null,
- "temperature": null,
- "model_name": null,
- "max_tokens": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "stream": null,
- "system_message": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "max_tokens",
- "model_kwargs",
- "model_name",
- "openai_api_base",
- "openai_api_key",
- "temperature",
- "input_value",
- "system_message",
- "stream"
- ],
- "beta": false
- },
- "id": "OpenAIModel-9RykF"
- },
- "selected": false,
- "width": 384,
- "height": 563,
- "positionAbsolute": {
- "x": 2561.5850334731617,
- "y": 553.2745131130916
- },
- "dragging": false
- },
- {
- "id": "TextOutput-vrs6T",
- "type": "genericNode",
- "position": {
- "x": 1911.4785906252087,
- "y": 247.39079954376987
- },
- "data": {
- "type": "TextOutput",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Value",
- "advanced": false,
- "input_types": [
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "Text or Record to be passed as output.",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextOutput(TextComponent):\n display_name = \"Text Output\"\n description = \"Display a text output in the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as output.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(self, input_value: Optional[Text] = \"\", record_template: str = \"\") -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a text output in the Playground.",
- "icon": "type",
- "base_classes": [
- "str",
- "object",
- "Text"
- ],
- "display_name": "Inspect Memory",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "record_template": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "TextOutput-vrs6T"
- },
- "selected": false,
- "width": 384,
- "height": 289,
- "positionAbsolute": {
- "x": 1911.4785906252087,
- "y": 247.39079954376987
- },
- "dragging": false
- }
- ],
- "edges": [
- {
- "source": "MemoryComponent-cdA1J",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153MemoryComponent\u0153,\u0153id\u0153:\u0153MemoryComponent-cdA1J\u0153}",
- "target": "Prompt-ODkUx",
- "targetHandle": "{\u0153fieldName\u0153:\u0153context\u0153,\u0153id\u0153:\u0153Prompt-ODkUx\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "context",
- "type": "str",
- "id": "Prompt-ODkUx",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ]
- },
- "sourceHandle": {
- "baseClasses": [
- "str",
- "Text",
- "object"
- ],
- "dataType": "MemoryComponent",
- "id": "MemoryComponent-cdA1J"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-MemoryComponent-cdA1J{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153MemoryComponent\u0153,\u0153id\u0153:\u0153MemoryComponent-cdA1J\u0153}-Prompt-ODkUx{\u0153fieldName\u0153:\u0153context\u0153,\u0153id\u0153:\u0153Prompt-ODkUx\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "selected": false
- },
- {
- "source": "ChatInput-t7F8v",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153object\u0153,\u0153Record\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-t7F8v\u0153}",
- "target": "Prompt-ODkUx",
- "targetHandle": "{\u0153fieldName\u0153:\u0153user_message\u0153,\u0153id\u0153:\u0153Prompt-ODkUx\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "user_message",
- "type": "str",
- "id": "Prompt-ODkUx",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ]
- },
- "sourceHandle": {
- "baseClasses": [
- "Text",
- "object",
- "Record",
- "str"
- ],
- "dataType": "ChatInput",
- "id": "ChatInput-t7F8v"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-ChatInput-t7F8v{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153object\u0153,\u0153Record\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-t7F8v\u0153}-Prompt-ODkUx{\u0153fieldName\u0153:\u0153user_message\u0153,\u0153id\u0153:\u0153Prompt-ODkUx\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "selected": false
- },
- {
- "source": "Prompt-ODkUx",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-ODkUx\u0153}",
- "target": "OpenAIModel-9RykF",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-9RykF\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "OpenAIModel-9RykF",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "Text",
- "str",
- "object"
- ],
- "dataType": "Prompt",
- "id": "Prompt-ODkUx"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-Prompt-ODkUx{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-ODkUx\u0153}-OpenAIModel-9RykF{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-9RykF\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- },
- {
- "source": "OpenAIModel-9RykF",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153object\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-9RykF\u0153}",
- "target": "ChatOutput-P1jEe",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-P1jEe\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "ChatOutput-P1jEe",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "str",
- "object",
- "Text"
- ],
- "dataType": "OpenAIModel",
- "id": "OpenAIModel-9RykF"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-OpenAIModel-9RykF{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153object\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-9RykF\u0153}-ChatOutput-P1jEe{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-P1jEe\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- },
- {
- "source": "MemoryComponent-cdA1J",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153MemoryComponent\u0153,\u0153id\u0153:\u0153MemoryComponent-cdA1J\u0153}",
- "target": "TextOutput-vrs6T",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-vrs6T\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "TextOutput-vrs6T",
- "inputTypes": [
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "str",
- "Text",
- "object"
- ],
- "dataType": "MemoryComponent",
- "id": "MemoryComponent-cdA1J"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-foreground stroke-connection",
- "id": "reactflow__edge-MemoryComponent-cdA1J{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153MemoryComponent\u0153,\u0153id\u0153:\u0153MemoryComponent-cdA1J\u0153}-TextOutput-vrs6T{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-vrs6T\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- }
- ],
- "viewport": {
- "x": -569.862554459756,
- "y": -42.08339711050985,
- "zoom": 0.4868590524514978
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatOutput-P1jEe"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 477,
+ "dragging": false,
+ "positionAbsolute": {
+ "x": 3154.916355514023,
+ "y": 851.051882666333
}
- },
- "description": "This project can be used as a starting point for building a Chat experience with user specific memory. You can set a different Session ID to start a new message history.",
- "name": "Memory Chatbot",
- "last_tested_version": "1.0.0a0",
- "is_component": false
+ },
+ {
+ "id": "MemoryComponent-cdA1J",
+ "type": "genericNode",
+ "position": {
+ "x": 1289.9606870058817,
+ "y": 442.16804561053766
+ },
+ "data": {
+ "type": "MemoryComponent",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langflow.base.memory.memory import BaseMemoryComponent\nfrom langflow.field_typing import Text\nfrom langflow.helpers.record import records_to_text\nfrom langflow.memory import get_messages\nfrom langflow.schema.schema import Record\n\n\nclass MemoryComponent(BaseMemoryComponent):\n display_name = \"Chat Memory\"\n description = \"Retrieves stored chat messages given a specific Session ID.\"\n beta: bool = True\n icon = \"history\"\n\n def build_config(self):\n return {\n \"sender\": {\n \"options\": [\"Machine\", \"User\", \"Machine and User\"],\n \"display_name\": \"Sender Type\",\n },\n \"sender_name\": {\"display_name\": \"Sender Name\", \"advanced\": True},\n \"n_messages\": {\n \"display_name\": \"Number of Messages\",\n \"info\": \"Number of messages to retrieve.\",\n },\n \"session_id\": {\n \"display_name\": \"Session ID\",\n \"info\": \"Session ID of the chat history.\",\n \"input_types\": [\"Text\"],\n },\n \"order\": {\n \"options\": [\"Ascending\", \"Descending\"],\n \"display_name\": \"Order\",\n \"info\": \"Order of the messages.\",\n \"advanced\": True,\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def get_messages(self, **kwargs) -> list[Record]:\n # Validate kwargs by checking if it contains the correct keys\n if \"sender\" not in kwargs:\n kwargs[\"sender\"] = None\n if \"sender_name\" not in kwargs:\n kwargs[\"sender_name\"] = None\n if \"session_id\" not in kwargs:\n kwargs[\"session_id\"] = None\n if \"limit\" not in kwargs:\n kwargs[\"limit\"] = 5\n if \"order\" not in kwargs:\n kwargs[\"order\"] = \"Descending\"\n\n kwargs[\"order\"] = \"DESC\" if kwargs[\"order\"] == \"Descending\" else \"ASC\"\n if kwargs[\"sender\"] == \"Machine and User\":\n kwargs[\"sender\"] = None\n return get_messages(**kwargs)\n\n def build(\n self,\n sender: Optional[str] = \"Machine and User\",\n sender_name: Optional[str] = None,\n session_id: Optional[str] = None,\n n_messages: int = 5,\n order: Optional[str] = \"Descending\",\n record_template: Optional[str] = \"{sender_name}: {text}\",\n ) -> Text:\n messages = self.get_messages(\n sender=sender,\n sender_name=sender_name,\n session_id=session_id,\n limit=n_messages,\n order=order,\n )\n messages_str = records_to_text(template=record_template or \"\", records=messages)\n self.status = messages_str\n return messages_str\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "n_messages": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 5,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "n_messages",
+ "display_name": "Number of Messages",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Number of messages to retrieve.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "order": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Descending",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Ascending", "Descending"],
+ "name": "order",
+ "display_name": "Order",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Order of the messages.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "{sender_name}: {text}",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Machine and User",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User", "Machine and User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "Session ID of the chat history.",
+ "load_from_db": false,
+ "title_case": false,
+ "value": "MySessionID"
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Retrieves stored chat messages given a specific Session ID.",
+ "icon": "history",
+ "base_classes": ["str", "Text", "object"],
+ "display_name": "Chat Memory",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "session_id": null,
+ "n_messages": null,
+ "order": null,
+ "record_template": null
+ },
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": true
+ },
+ "id": "MemoryComponent-cdA1J",
+ "description": "Retrieves stored chat messages given a specific Session ID.",
+ "display_name": "Chat Memory"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 489,
+ "dragging": false,
+ "positionAbsolute": {
+ "x": 1289.9606870058817,
+ "y": 442.16804561053766
+ }
+ },
+ {
+ "id": "Prompt-ODkUx",
+ "type": "genericNode",
+ "position": {
+ "x": 1894.594426342426,
+ "y": 753.3797365481901
+ },
+ "data": {
+ "type": "Prompt",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "template": {
+ "type": "prompt",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "{context}\n\nUser: {user_message}\nAI: ",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "template",
+ "display_name": "Template",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent",
+ "context": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "context",
+ "display_name": "context",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ },
+ "user_message": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "user_message",
+ "display_name": "user_message",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ }
+ },
+ "description": "Create a prompt template with dynamic variables.",
+ "icon": "prompts",
+ "is_input": null,
+ "is_output": null,
+ "is_composition": null,
+ "base_classes": ["Text", "str", "object"],
+ "name": "",
+ "display_name": "Prompt",
+ "documentation": "",
+ "custom_fields": {
+ "template": ["context", "user_message"]
+ },
+ "output_types": ["Text"],
+ "full_path": null,
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false,
+ "error": null
+ },
+ "id": "Prompt-ODkUx",
+ "description": "A component for creating prompt templates using dynamic variables.",
+ "display_name": "Prompt"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 477,
+ "dragging": false,
+ "positionAbsolute": {
+ "x": 1894.594426342426,
+ "y": 753.3797365481901
+ }
+ },
+ {
+ "id": "OpenAIModel-9RykF",
+ "type": "genericNode",
+ "position": {
+ "x": 2561.5850334731617,
+ "y": 553.2745131130916
+ },
+ "data": {
+ "type": "OpenAIModel",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Input",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": MODEL_NAMES,\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float = 0.1,\n model_name: str = \"gpt-4o\",\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n output = ChatOpenAI(\n max_tokens=max_tokens or None,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_tokens": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 256,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_tokens",
+ "display_name": "Max Tokens",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "gpt-4-1106-preview",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "gpt-4o",
+ "gpt-4-turbo",
+ "gpt-4-turbo-preview",
+ "gpt-3.5-turbo",
+ "gpt-3.5-turbo-0125"
+ ],
+ "name": "model_name",
+ "display_name": "Model Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The OpenAI API Key to use for the OpenAI model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "OPENAI_API_KEY"
+ },
+ "stream": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "stream",
+ "display_name": "Stream",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Stream the response from the model. Streaming works only in Chat.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "system_message": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "system_message",
+ "display_name": "System Message",
+ "advanced": true,
+ "dynamic": false,
+ "info": "System message to pass to the model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "temperature": {
+ "type": "float",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "0.2",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "temperature",
+ "display_name": "Temperature",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
+ },
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Generates text using OpenAI LLMs.",
+ "icon": "OpenAI",
+ "base_classes": ["str", "object", "Text"],
+ "display_name": "OpenAI",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "openai_api_key": null,
+ "temperature": null,
+ "model_name": null,
+ "max_tokens": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "stream": null,
+ "system_message": null
+ },
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "max_tokens",
+ "model_kwargs",
+ "model_name",
+ "openai_api_base",
+ "openai_api_key",
+ "temperature",
+ "input_value",
+ "system_message",
+ "stream"
+ ],
+ "beta": false
+ },
+ "id": "OpenAIModel-9RykF"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 563,
+ "positionAbsolute": {
+ "x": 2561.5850334731617,
+ "y": 553.2745131130916
+ },
+ "dragging": false
+ },
+ {
+ "id": "TextOutput-vrs6T",
+ "type": "genericNode",
+ "position": {
+ "x": 1911.4785906252087,
+ "y": 247.39079954376987
+ },
+ "data": {
+ "type": "TextOutput",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Value",
+ "advanced": false,
+ "input_types": ["Record", "Text"],
+ "dynamic": false,
+ "info": "Text or Record to be passed as output.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextOutput(TextComponent):\n display_name = \"Text Output\"\n description = \"Display a text output in the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as output.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(self, input_value: Optional[Text] = \"\", record_template: str = \"\") -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Display a text output in the Playground.",
+ "icon": "type",
+ "base_classes": ["str", "object", "Text"],
+ "display_name": "Inspect Memory",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "record_template": null
+ },
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "TextOutput-vrs6T"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 289,
+ "positionAbsolute": {
+ "x": 1911.4785906252087,
+ "y": 247.39079954376987
+ },
+ "dragging": false
+ }
+ ],
+ "edges": [
+ {
+ "source": "MemoryComponent-cdA1J",
+ "sourceHandle": "{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œMemoryComponentœ,œidœ:œMemoryComponent-cdA1Jœ}",
+ "target": "Prompt-ODkUx",
+ "targetHandle": "{œfieldNameœ:œcontextœ,œidœ:œPrompt-ODkUxœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "context",
+ "type": "str",
+ "id": "Prompt-ODkUx",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"]
+ },
+ "sourceHandle": {
+ "baseClasses": ["str", "Text", "object"],
+ "dataType": "MemoryComponent",
+ "id": "MemoryComponent-cdA1J"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-MemoryComponent-cdA1J{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œMemoryComponentœ,œidœ:œMemoryComponent-cdA1Jœ}-Prompt-ODkUx{œfieldNameœ:œcontextœ,œidœ:œPrompt-ODkUxœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "selected": false
+ },
+ {
+ "source": "ChatInput-t7F8v",
+ "sourceHandle": "{œbaseClassesœ:[œTextœ,œobjectœ,œRecordœ,œstrœ],œdataTypeœ:œChatInputœ,œidœ:œChatInput-t7F8vœ}",
+ "target": "Prompt-ODkUx",
+ "targetHandle": "{œfieldNameœ:œuser_messageœ,œidœ:œPrompt-ODkUxœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "user_message",
+ "type": "str",
+ "id": "Prompt-ODkUx",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"]
+ },
+ "sourceHandle": {
+ "baseClasses": ["Text", "object", "Record", "str"],
+ "dataType": "ChatInput",
+ "id": "ChatInput-t7F8v"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-ChatInput-t7F8v{œbaseClassesœ:[œTextœ,œobjectœ,œRecordœ,œstrœ],œdataTypeœ:œChatInputœ,œidœ:œChatInput-t7F8vœ}-Prompt-ODkUx{œfieldNameœ:œuser_messageœ,œidœ:œPrompt-ODkUxœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "selected": false
+ },
+ {
+ "source": "Prompt-ODkUx",
+ "sourceHandle": "{œbaseClassesœ:[œTextœ,œstrœ,œobjectœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-ODkUxœ}",
+ "target": "OpenAIModel-9RykF",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-9RykFœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "OpenAIModel-9RykF",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Text", "str", "object"],
+ "dataType": "Prompt",
+ "id": "Prompt-ODkUx"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-Prompt-ODkUx{œbaseClassesœ:[œTextœ,œstrœ,œobjectœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-ODkUxœ}-OpenAIModel-9RykF{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-9RykFœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "OpenAIModel-9RykF",
+ "sourceHandle": "{œbaseClassesœ:[œstrœ,œobjectœ,œTextœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-9RykFœ}",
+ "target": "ChatOutput-P1jEe",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-P1jEeœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "ChatOutput-P1jEe",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["str", "object", "Text"],
+ "dataType": "OpenAIModel",
+ "id": "OpenAIModel-9RykF"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-OpenAIModel-9RykF{œbaseClassesœ:[œstrœ,œobjectœ,œTextœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-9RykFœ}-ChatOutput-P1jEe{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-P1jEeœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "MemoryComponent-cdA1J",
+ "sourceHandle": "{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œMemoryComponentœ,œidœ:œMemoryComponent-cdA1Jœ}",
+ "target": "TextOutput-vrs6T",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œTextOutput-vrs6Tœ,œinputTypesœ:[œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "TextOutput-vrs6T",
+ "inputTypes": ["Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["str", "Text", "object"],
+ "dataType": "MemoryComponent",
+ "id": "MemoryComponent-cdA1J"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-foreground stroke-connection",
+ "id": "reactflow__edge-MemoryComponent-cdA1J{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œMemoryComponentœ,œidœ:œMemoryComponent-cdA1Jœ}-TextOutput-vrs6T{œfieldNameœ:œinput_valueœ,œidœ:œTextOutput-vrs6Tœ,œinputTypesœ:[œRecordœ,œTextœ],œtypeœ:œstrœ}"
+ }
+ ],
+ "viewport": {
+ "x": -569.862554459756,
+ "y": -42.08339711050985,
+ "zoom": 0.4868590524514978
+ }
+ },
+ "description": "This project can be used as a starting point for building a Chat experience with user specific memory. You can set a different Session ID to start a new message history.",
+ "name": "Memory Chatbot",
+ "last_tested_version": "1.0.0a0",
+ "is_component": false
}
diff --git a/src/backend/base/langflow/initial_setup/starter_projects/Langflow Prompt Chaining.json b/src/backend/base/langflow/initial_setup/starter_projects/Langflow Prompt Chaining.json
index dd1b1307f..d49eba054 100644
--- a/src/backend/base/langflow/initial_setup/starter_projects/Langflow Prompt Chaining.json
+++ b/src/backend/base/langflow/initial_setup/starter_projects/Langflow Prompt Chaining.json
@@ -1,1769 +1,1586 @@
{
- "id": "85392e54-20f3-4ab5-a179-cb4bef16f639",
- "data": {
- "nodes": [
- {
- "id": "Prompt-amqBu",
- "type": "genericNode",
- "position": {
- "x": 2191.5837146441663,
- "y": 1047.9307944451873
- },
- "data": {
- "type": "Prompt",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "template": {
- "type": "prompt",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "You are a helpful assistant. Given a long document, your task is to create a concise summary that captures the main points and key details. The summary should be clear, accurate, and succinct. Please provide the summary in the format below:\n####\n{document}\n####\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "template",
- "display_name": "Template",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent",
- "document": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "document",
- "display_name": "document",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- }
- },
- "description": "Create a prompt template with dynamic variables.",
- "icon": "prompts",
- "is_input": null,
- "is_output": null,
- "is_composition": null,
- "base_classes": [
- "object",
- "str",
- "Text"
- ],
- "name": "",
- "display_name": "Prompt",
- "documentation": "",
- "custom_fields": {
- "template": [
- "document"
- ]
- },
- "output_types": [
- "Text"
- ],
- "full_path": null,
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false,
- "error": null
- },
- "id": "Prompt-amqBu",
- "description": "Create a prompt template with dynamic variables.",
- "display_name": "Prompt"
- },
- "selected": false,
- "width": 384,
- "height": 385,
- "positionAbsolute": {
- "x": 2191.5837146441663,
- "y": 1047.9307944451873
- },
- "dragging": false
+ "id": "85392e54-20f3-4ab5-a179-cb4bef16f639",
+ "data": {
+ "nodes": [
+ {
+ "id": "Prompt-amqBu",
+ "type": "genericNode",
+ "position": {
+ "x": 2191.5837146441663,
+ "y": 1047.9307944451873
+ },
+ "data": {
+ "type": "Prompt",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "template": {
+ "type": "prompt",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "You are a helpful assistant. Given a long document, your task is to create a concise summary that captures the main points and key details. The summary should be clear, accurate, and succinct. Please provide the summary in the format below:\n####\n{document}\n####\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "template",
+ "display_name": "Template",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent",
+ "document": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "document",
+ "display_name": "document",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ }
},
- {
- "id": "Prompt-gTNiz",
- "type": "genericNode",
- "position": {
- "x": 3731.0813766902447,
- "y": 799.631909121391
- },
- "data": {
- "type": "Prompt",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "template": {
- "type": "prompt",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "Given a summary of an article, please create two multiple-choice questions that cover the key points and details mentioned. Ensure the questions are clear and provide three options (A, B, C), with one correct answer.\n####\n{summary}\n####",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "template",
- "display_name": "Template",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent",
- "summary": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "summary",
- "display_name": "summary",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- }
- },
- "description": "Create a prompt template with dynamic variables.",
- "icon": "prompts",
- "is_input": null,
- "is_output": null,
- "is_composition": null,
- "base_classes": [
- "object",
- "str",
- "Text"
- ],
- "name": "",
- "display_name": "Prompt",
- "documentation": "",
- "custom_fields": {
- "template": [
- "summary"
- ]
- },
- "output_types": [
- "Text"
- ],
- "full_path": null,
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false,
- "error": null
- },
- "id": "Prompt-gTNiz",
- "description": "Create a prompt template with dynamic variables.",
- "display_name": "Prompt"
- },
- "selected": false,
- "width": 384,
- "height": 385,
- "dragging": false
+ "description": "Create a prompt template with dynamic variables.",
+ "icon": "prompts",
+ "is_input": null,
+ "is_output": null,
+ "is_composition": null,
+ "base_classes": ["object", "str", "Text"],
+ "name": "",
+ "display_name": "Prompt",
+ "documentation": "",
+ "custom_fields": {
+ "template": ["document"]
},
- {
- "id": "ChatOutput-EJkG3",
- "type": "genericNode",
- "position": {
- "x": 3722.1747844849388,
- "y": 1283.413553222214
- },
- "data": {
- "type": "ChatOutput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "{text}",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "In case of Message being a Record, this template will be used to convert it to text.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Machine",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "Summarizer",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": true,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a chat message in the Playground.",
- "icon": "ChatOutput",
- "base_classes": [
- "object",
- "Record",
- "Text",
- "str"
- ],
- "display_name": "Chat Output",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null,
- "record_template": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatOutput-EJkG3"
- },
- "selected": false,
- "width": 384,
- "height": 385,
- "dragging": false
+ "output_types": ["Text"],
+ "full_path": null,
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false,
+ "error": null
+ },
+ "id": "Prompt-amqBu",
+ "description": "Create a prompt template with dynamic variables.",
+ "display_name": "Prompt"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 385,
+ "positionAbsolute": {
+ "x": 2191.5837146441663,
+ "y": 1047.9307944451873
+ },
+ "dragging": false
+ },
+ {
+ "id": "Prompt-gTNiz",
+ "type": "genericNode",
+ "position": {
+ "x": 3731.0813766902447,
+ "y": 799.631909121391
+ },
+ "data": {
+ "type": "Prompt",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "template": {
+ "type": "prompt",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "Given a summary of an article, please create two multiple-choice questions that cover the key points and details mentioned. Ensure the questions are clear and provide three options (A, B, C), with one correct answer.\n####\n{summary}\n####",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "template",
+ "display_name": "Template",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent",
+ "summary": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "summary",
+ "display_name": "summary",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ }
},
- {
- "id": "ChatOutput-DNmvg",
- "type": "genericNode",
- "position": {
- "x": 5077.71285886074,
- "y": 1232.9152769735522
- },
- "data": {
- "type": "ChatOutput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "{text}",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "In case of Message being a Record, this template will be used to convert it to text.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Machine",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "Question Generator",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": true,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a chat message in the Playground.",
- "icon": "ChatOutput",
- "base_classes": [
- "object",
- "Record",
- "Text",
- "str"
- ],
- "display_name": "Chat Output",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null,
- "record_template": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatOutput-DNmvg"
- },
- "selected": false,
- "width": 384,
- "height": 385
+ "description": "Create a prompt template with dynamic variables.",
+ "icon": "prompts",
+ "is_input": null,
+ "is_output": null,
+ "is_composition": null,
+ "base_classes": ["object", "str", "Text"],
+ "name": "",
+ "display_name": "Prompt",
+ "documentation": "",
+ "custom_fields": {
+ "template": ["summary"]
},
- {
- "id": "TextInput-sptaH",
- "type": "genericNode",
- "position": {
- "x": 1700.5624822024752,
- "y": 1039.603088937466
- },
- "data": {
- "type": "TextInput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextInput(TextComponent):\n display_name = \"Text Input\"\n description = \"Get text inputs from the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as input.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Optional[Text] = \"\",\n record_template: Optional[str] = \"\",\n ) -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "Revolutionary Nano-Battery Technology Unveiled In a groundbreaking announcement yesterday, researchers from the fictional Tech Innovations Institute revealed the development of a new nano-battery technology that promises to revolutionize energy storage. The new battery, dubbed the \"EnerGCell\", uses advanced nanomaterials to achieve unprecedented efficiency and storage capacities. According to lead researcher Dr. Ada Byron, the EnerGCell can store up to ten times more energy than the best lithium-ion batteries available today, while charging in just a fraction of the time. \"We're talking about charging your electric vehicle in just five minutes for a range of over 1,000 miles,\" Dr. Byron stated during the press conference. The technology behind the EnerGCell involves a complex arrangement of nanostructured electrodes that allow for rapid ion transfer and extremely high energy density. This breakthrough was achieved after a decade of research into nanomaterials and their applications in energy storage. The implications of this technology are vast, promising to accelerate the adoption of renewable energy by making it more practical and affordable to store wind and solar power. It could also lead to significant advancements in electric vehicles, mobile devices, and any other technology that relies on batteries. Despite the excitement, some experts are calling for patience, noting that the EnerGCell is still in its early stages of development and may take several years before it's commercially available. However, the potential impact of such a technology on the environment and the global economy is undeniable. Tech Innovations Institute plans to continue refining the EnerGCell and begin pilot projects with select partners in the coming year. If successful, this nano-battery technology could indeed be the breakthrough needed to usher in a new era of clean energy and technology.",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Value",
- "advanced": false,
- "input_types": [
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "Text or Record to be passed as input.",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Get text inputs from the Playground.",
- "icon": "type",
- "base_classes": [
- "str",
- "Text",
- "object"
- ],
- "display_name": "Text Input",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "record_template": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "TextInput-sptaH"
- },
- "selected": false,
- "width": 384,
- "height": 290,
- "positionAbsolute": {
- "x": 1700.5624822024752,
- "y": 1039.603088937466
- },
- "dragging": false
+ "output_types": ["Text"],
+ "full_path": null,
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false,
+ "error": null
+ },
+ "id": "Prompt-gTNiz",
+ "description": "Create a prompt template with dynamic variables.",
+ "display_name": "Prompt"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 385,
+ "dragging": false
+ },
+ {
+ "id": "ChatOutput-EJkG3",
+ "type": "genericNode",
+ "position": {
+ "x": 3722.1747844849388,
+ "y": 1283.413553222214
+ },
+ "data": {
+ "type": "ChatOutput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "{text}",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "In case of Message being a Record, this template will be used to convert it to text.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Machine",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "Summarizer",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "TextOutput-2MS4a",
- "type": "genericNode",
- "position": {
- "x": 2917.216113690115,
- "y": 513.0058511435552
- },
- "data": {
- "type": "TextOutput",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Value",
- "advanced": false,
- "input_types": [
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "Text or Record to be passed as output.",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextOutput(TextComponent):\n display_name = \"Text Output\"\n description = \"Display a text output in the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as output.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(self, input_value: Optional[Text] = \"\", record_template: str = \"\") -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a text output in the Playground.",
- "icon": "type",
- "base_classes": [
- "str",
- "Text",
- "object"
- ],
- "display_name": "First Prompt",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "record_template": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "TextOutput-2MS4a"
- },
- "selected": false,
- "width": 384,
- "height": 290,
- "positionAbsolute": {
- "x": 2917.216113690115,
- "y": 513.0058511435552
- },
- "dragging": false
+ "description": "Display a chat message in the Playground.",
+ "icon": "ChatOutput",
+ "base_classes": ["object", "Record", "Text", "str"],
+ "display_name": "Chat Output",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null,
+ "record_template": null
},
- {
- "id": "OpenAIModel-uYXZJ",
- "type": "genericNode",
- "position": {
- "x": 2925.784767523062,
- "y": 933.6465680967775
- },
- "data": {
- "type": "OpenAIModel",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Input",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": MODEL_NAMES,\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float,\n model_name: str = \"gpt-4o\",\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n output = ChatOpenAI(\n max_tokens=max_tokens or None,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_tokens": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 256,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_tokens",
- "display_name": "Max Tokens",
- "advanced": true,
- "dynamic": false,
- "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
- "load_from_db": false,
- "title_case": false
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "gpt-4-turbo-preview",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "gpt-4o",
- "gpt-4-turbo",
- "gpt-4-turbo-preview",
- "gpt-3.5-turbo",
- "gpt-3.5-turbo-0125"
- ],
- "name": "model_name",
- "display_name": "Model Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "The OpenAI API Key to use for the OpenAI model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "OPENAI_API_KEY"
- },
- "stream": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "stream",
- "display_name": "Stream",
- "advanced": true,
- "dynamic": false,
- "info": "Stream the response from the model. Streaming works only in Chat.",
- "load_from_db": false,
- "title_case": false
- },
- "system_message": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "system_message",
- "display_name": "System Message",
- "advanced": true,
- "dynamic": false,
- "info": "System message to pass to the model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "temperature": {
- "type": "float",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 0.1,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "temperature",
- "display_name": "Temperature",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent"
- },
- "description": "Generates text using OpenAI LLMs.",
- "icon": "OpenAI",
- "base_classes": [
- "str",
- "Text",
- "object"
- ],
- "display_name": "OpenAI",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "openai_api_key": null,
- "temperature": null,
- "model_name": null,
- "max_tokens": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "stream": null,
- "system_message": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "max_tokens",
- "model_kwargs",
- "model_name",
- "openai_api_base",
- "openai_api_key",
- "temperature",
- "input_value",
- "system_message",
- "stream"
- ],
- "beta": false
- },
- "id": "OpenAIModel-uYXZJ"
- },
- "selected": false,
- "width": 384,
- "height": 565,
- "positionAbsolute": {
- "x": 2925.784767523062,
- "y": 933.6465680967775
- },
- "dragging": false
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatOutput-EJkG3"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 385,
+ "dragging": false
+ },
+ {
+ "id": "ChatOutput-DNmvg",
+ "type": "genericNode",
+ "position": {
+ "x": 5077.71285886074,
+ "y": 1232.9152769735522
+ },
+ "data": {
+ "type": "ChatOutput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "{text}",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "In case of Message being a Record, this template will be used to convert it to text.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Machine",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "Question Generator",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "TextOutput-MUDOR",
- "type": "genericNode",
- "position": {
- "x": 4446.064323520379,
- "y": 633.833297518702
- },
- "data": {
- "type": "TextOutput",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Value",
- "advanced": false,
- "input_types": [
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "Text or Record to be passed as output.",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextOutput(TextComponent):\n display_name = \"Text Output\"\n description = \"Display a text output in the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as output.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(self, input_value: Optional[Text] = \"\", record_template: str = \"\") -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a text output in the Playground.",
- "icon": "type",
- "base_classes": [
- "str",
- "Text",
- "object"
- ],
- "display_name": "Second Prompt",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "record_template": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "TextOutput-MUDOR"
- },
- "selected": false,
- "width": 384,
- "height": 290,
- "dragging": false,
- "positionAbsolute": {
- "x": 4446.064323520379,
- "y": 633.833297518702
- }
+ "description": "Display a chat message in the Playground.",
+ "icon": "ChatOutput",
+ "base_classes": ["object", "Record", "Text", "str"],
+ "display_name": "Chat Output",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null,
+ "record_template": null
},
- {
- "id": "OpenAIModel-XawYB",
- "type": "genericNode",
- "position": {
- "x": 4500.152018344182,
- "y": 1027.7382026227656
- },
- "data": {
- "type": "OpenAIModel",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Input",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": MODEL_NAMES,\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. 
Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float,\n model_name: str = \"gpt-4o\",\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n output = ChatOpenAI(\n max_tokens=max_tokens or None,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_tokens": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 256,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_tokens",
- "display_name": "Max Tokens",
- "advanced": true,
- "dynamic": false,
- "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
- "load_from_db": false,
- "title_case": false
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "gpt-4-turbo-preview",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "gpt-4o",
- "gpt-4-turbo",
- "gpt-4-turbo-preview",
- "gpt-3.5-turbo",
- "gpt-3.5-turbo-0125"
- ],
- "name": "model_name",
- "display_name": "Model Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "The OpenAI API Key to use for the OpenAI model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": ""
- },
- "stream": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "stream",
- "display_name": "Stream",
- "advanced": true,
- "dynamic": false,
- "info": "Stream the response from the model. Streaming works only in Chat.",
- "load_from_db": false,
- "title_case": false
- },
- "system_message": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "system_message",
- "display_name": "System Message",
- "advanced": true,
- "dynamic": false,
- "info": "System message to pass to the model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "temperature": {
- "type": "float",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 0.1,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "temperature",
- "display_name": "Temperature",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent"
- },
- "description": "Generates text using OpenAI LLMs.",
- "icon": "OpenAI",
- "base_classes": [
- "str",
- "Text",
- "object"
- ],
- "display_name": "OpenAI",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "openai_api_key": null,
- "temperature": null,
- "model_name": null,
- "max_tokens": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "stream": null,
- "system_message": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "max_tokens",
- "model_kwargs",
- "model_name",
- "openai_api_base",
- "openai_api_key",
- "temperature",
- "input_value",
- "system_message",
- "stream"
- ],
- "beta": false
- },
- "id": "OpenAIModel-XawYB"
- },
- "selected": false,
- "width": 384,
- "height": 565,
- "positionAbsolute": {
- "x": 4500.152018344182,
- "y": 1027.7382026227656
- },
- "dragging": false
- }
- ],
- "edges": [
- {
- "source": "TextInput-sptaH",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153TextInput\u0153,\u0153id\u0153:\u0153TextInput-sptaH\u0153}",
- "target": "Prompt-amqBu",
- "targetHandle": "{\u0153fieldName\u0153:\u0153document\u0153,\u0153id\u0153:\u0153Prompt-amqBu\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "document",
- "id": "Prompt-amqBu",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "str",
- "Text",
- "object"
- ],
- "dataType": "TextInput",
- "id": "TextInput-sptaH"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-TextInput-sptaH{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153TextInput\u0153,\u0153id\u0153:\u0153TextInput-sptaH\u0153}-Prompt-amqBu{\u0153fieldName\u0153:\u0153document\u0153,\u0153id\u0153:\u0153Prompt-amqBu\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatOutput-DNmvg"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 385
+ },
+ {
+ "id": "TextInput-sptaH",
+ "type": "genericNode",
+ "position": {
+ "x": 1700.5624822024752,
+ "y": 1039.603088937466
+ },
+ "data": {
+ "type": "TextInput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextInput(TextComponent):\n display_name = \"Text Input\"\n description = \"Get text inputs from the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as input.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Optional[Text] = \"\",\n record_template: Optional[str] = \"\",\n ) -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "Revolutionary Nano-Battery Technology Unveiled In a groundbreaking announcement yesterday, researchers from the fictional Tech Innovations Institute revealed the development of a new nano-battery technology that promises to revolutionize energy storage. The new battery, dubbed the \"EnerGCell\", uses advanced nanomaterials to achieve unprecedented efficiency and storage capacities. According to lead researcher Dr. Ada Byron, the EnerGCell can store up to ten times more energy than the best lithium-ion batteries available today, while charging in just a fraction of the time. \"We're talking about charging your electric vehicle in just five minutes for a range of over 1,000 miles,\" Dr. Byron stated during the press conference. The technology behind the EnerGCell involves a complex arrangement of nanostructured electrodes that allow for rapid ion transfer and extremely high energy density. This breakthrough was achieved after a decade of research into nanomaterials and their applications in energy storage. The implications of this technology are vast, promising to accelerate the adoption of renewable energy by making it more practical and affordable to store wind and solar power. It could also lead to significant advancements in electric vehicles, mobile devices, and any other technology that relies on batteries. Despite the excitement, some experts are calling for patience, noting that the EnerGCell is still in its early stages of development and may take several years before it's commercially available. However, the potential impact of such a technology on the environment and the global economy is undeniable. Tech Innovations Institute plans to continue refining the EnerGCell and begin pilot projects with select partners in the coming year. If successful, this nano-battery technology could indeed be the breakthrough needed to usher in a new era of clean energy and technology.",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Value",
+ "advanced": false,
+ "input_types": ["Record", "Text"],
+ "dynamic": false,
+ "info": "Text or Record to be passed as input.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "Prompt-amqBu",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-amqBu\u0153}",
- "target": "TextOutput-2MS4a",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-2MS4a\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "TextOutput-2MS4a",
- "inputTypes": [
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "str",
- "Text"
- ],
- "dataType": "Prompt",
- "id": "Prompt-amqBu"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-Prompt-amqBu{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-amqBu\u0153}-TextOutput-2MS4a{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-2MS4a\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "description": "Get text inputs from the Playground.",
+ "icon": "type",
+ "base_classes": ["str", "Text", "object"],
+ "display_name": "Text Input",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "record_template": null
},
- {
- "source": "Prompt-amqBu",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-amqBu\u0153}",
- "target": "OpenAIModel-uYXZJ",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-uYXZJ\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "OpenAIModel-uYXZJ",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "str",
- "Text"
- ],
- "dataType": "Prompt",
- "id": "Prompt-amqBu"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-Prompt-amqBu{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-amqBu\u0153}-OpenAIModel-uYXZJ{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-uYXZJ\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "TextInput-sptaH"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 290,
+ "positionAbsolute": {
+ "x": 1700.5624822024752,
+ "y": 1039.603088937466
+ },
+ "dragging": false
+ },
+ {
+ "id": "TextOutput-2MS4a",
+ "type": "genericNode",
+ "position": {
+ "x": 2917.216113690115,
+ "y": 513.0058511435552
+ },
+ "data": {
+ "type": "TextOutput",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Value",
+ "advanced": false,
+ "input_types": ["Record", "Text"],
+ "dynamic": false,
+ "info": "Text or Record to be passed as output.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextOutput(TextComponent):\n display_name = \"Text Output\"\n description = \"Display a text output in the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as output.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(self, input_value: Optional[Text] = \"\", record_template: str = \"\") -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "OpenAIModel-uYXZJ",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-uYXZJ\u0153}",
- "target": "Prompt-gTNiz",
- "targetHandle": "{\u0153fieldName\u0153:\u0153summary\u0153,\u0153id\u0153:\u0153Prompt-gTNiz\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "summary",
- "id": "Prompt-gTNiz",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "str",
- "Text",
- "object"
- ],
- "dataType": "OpenAIModel",
- "id": "OpenAIModel-uYXZJ"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-OpenAIModel-uYXZJ{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-uYXZJ\u0153}-Prompt-gTNiz{\u0153fieldName\u0153:\u0153summary\u0153,\u0153id\u0153:\u0153Prompt-gTNiz\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "description": "Display a text output in the Playground.",
+ "icon": "type",
+ "base_classes": ["str", "Text", "object"],
+ "display_name": "First Prompt",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "record_template": null
},
- {
- "source": "OpenAIModel-uYXZJ",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-uYXZJ\u0153}",
- "target": "ChatOutput-EJkG3",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-EJkG3\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "ChatOutput-EJkG3",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "str",
- "Text",
- "object"
- ],
- "dataType": "OpenAIModel",
- "id": "OpenAIModel-uYXZJ"
- }
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "TextOutput-2MS4a"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 290,
+ "positionAbsolute": {
+ "x": 2917.216113690115,
+ "y": 513.0058511435552
+ },
+ "dragging": false
+ },
+ {
+ "id": "OpenAIModel-uYXZJ",
+ "type": "genericNode",
+ "position": {
+ "x": 2925.784767523062,
+ "y": 933.6465680967775
+ },
+ "data": {
+ "type": "OpenAIModel",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Input",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": MODEL_NAMES,\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. 
Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float = 0.1,\n model_name: str = \"gpt-4o\",\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n output = ChatOpenAI(\n max_tokens=max_tokens or None,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_tokens": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 256,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_tokens",
+ "display_name": "Max Tokens",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "gpt-4-turbo-preview",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "gpt-4o",
+ "gpt-4-turbo",
+ "gpt-4-turbo-preview",
+ "gpt-3.5-turbo",
+ "gpt-3.5-turbo-0125"
+ ],
+ "name": "model_name",
+ "display_name": "Model Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The OpenAI API Key to use for the OpenAI model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "OPENAI_API_KEY"
+ },
+ "stream": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "stream",
+ "display_name": "Stream",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Stream the response from the model. Streaming works only in Chat.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "system_message": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "system_message",
+ "display_name": "System Message",
+ "advanced": true,
+ "dynamic": false,
+ "info": "System message to pass to the model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "temperature": {
+ "type": "float",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 0.1,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "temperature",
+ "display_name": "Temperature",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
},
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-OpenAIModel-uYXZJ{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-uYXZJ\u0153}-ChatOutput-EJkG3{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-EJkG3\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "Prompt-gTNiz",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-gTNiz\u0153}",
- "target": "TextOutput-MUDOR",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-MUDOR\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "TextOutput-MUDOR",
- "inputTypes": [
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "str",
- "Text"
- ],
- "dataType": "Prompt",
- "id": "Prompt-gTNiz"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-Prompt-gTNiz{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-gTNiz\u0153}-TextOutput-MUDOR{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-MUDOR\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "description": "Generates text using OpenAI LLMs.",
+ "icon": "OpenAI",
+ "base_classes": ["str", "Text", "object"],
+ "display_name": "OpenAI",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "openai_api_key": null,
+ "temperature": null,
+ "model_name": null,
+ "max_tokens": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "stream": null,
+ "system_message": null
},
- {
- "source": "Prompt-gTNiz",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-gTNiz\u0153}",
- "target": "OpenAIModel-XawYB",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-XawYB\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "OpenAIModel-XawYB",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "str",
- "Text"
- ],
- "dataType": "Prompt",
- "id": "Prompt-gTNiz"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-Prompt-gTNiz{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153str\u0153,\u0153Text\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-gTNiz\u0153}-OpenAIModel-XawYB{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-XawYB\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "max_tokens",
+ "model_kwargs",
+ "model_name",
+ "openai_api_base",
+ "openai_api_key",
+ "temperature",
+ "input_value",
+ "system_message",
+ "stream"
+ ],
+ "beta": false
+ },
+ "id": "OpenAIModel-uYXZJ"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 565,
+ "positionAbsolute": {
+ "x": 2925.784767523062,
+ "y": 933.6465680967775
+ },
+ "dragging": false
+ },
+ {
+ "id": "TextOutput-MUDOR",
+ "type": "genericNode",
+ "position": {
+ "x": 4446.064323520379,
+ "y": 633.833297518702
+ },
+ "data": {
+ "type": "TextOutput",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Value",
+ "advanced": false,
+ "input_types": ["Record", "Text"],
+ "dynamic": false,
+ "info": "Text or Record to be passed as output.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextOutput(TextComponent):\n display_name = \"Text Output\"\n description = \"Display a text output in the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as output.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(self, input_value: Optional[Text] = \"\", record_template: str = \"\") -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "OpenAIModel-XawYB",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-XawYB\u0153}",
- "target": "ChatOutput-DNmvg",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-DNmvg\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "ChatOutput-DNmvg",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "str",
- "Text",
- "object"
- ],
- "dataType": "OpenAIModel",
- "id": "OpenAIModel-XawYB"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-OpenAIModel-XawYB{\u0153baseClasses\u0153:[\u0153str\u0153,\u0153Text\u0153,\u0153object\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-XawYB\u0153}-ChatOutput-DNmvg{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-DNmvg\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- }
- ],
- "viewport": {
- "x": -383.7251879618552,
- "y": 69.19813933800037,
- "zoom": 0.3105753483695743
+ "description": "Display a text output in the Playground.",
+ "icon": "type",
+ "base_classes": ["str", "Text", "object"],
+ "display_name": "Second Prompt",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "record_template": null
+ },
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "TextOutput-MUDOR"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 290,
+ "dragging": false,
+ "positionAbsolute": {
+ "x": 4446.064323520379,
+ "y": 633.833297518702
}
- },
- "description": "The Prompt Chaining flow chains prompts with LLMs, refining outputs through iterative stages.",
- "name": "Prompt Chaining",
- "last_tested_version": "1.0.0a0",
- "is_component": false
+ },
+ {
+ "id": "OpenAIModel-XawYB",
+ "type": "genericNode",
+ "position": {
+ "x": 4500.152018344182,
+ "y": 1027.7382026227656
+ },
+ "data": {
+ "type": "OpenAIModel",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Input",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": MODEL_NAMES,\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float = 0.1,\n model_name: str = \"gpt-4o\",\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n output = ChatOpenAI(\n max_tokens=max_tokens or None,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_tokens": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 256,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_tokens",
+ "display_name": "Max Tokens",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "gpt-4-turbo-preview",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "gpt-4o",
+ "gpt-4-turbo",
+ "gpt-4-turbo-preview",
+ "gpt-3.5-turbo",
+ "gpt-3.5-turbo-0125"
+ ],
+ "name": "model_name",
+ "display_name": "Model Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The OpenAI API Key to use for the OpenAI model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": ""
+ },
+ "stream": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "stream",
+ "display_name": "Stream",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Stream the response from the model. Streaming works only in Chat.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "system_message": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "system_message",
+ "display_name": "System Message",
+ "advanced": true,
+ "dynamic": false,
+ "info": "System message to pass to the model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "temperature": {
+ "type": "float",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 0.1,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "temperature",
+ "display_name": "Temperature",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
+ },
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Generates text using OpenAI LLMs.",
+ "icon": "OpenAI",
+ "base_classes": ["str", "Text", "object"],
+ "display_name": "OpenAI",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "openai_api_key": null,
+ "temperature": null,
+ "model_name": null,
+ "max_tokens": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "stream": null,
+ "system_message": null
+ },
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "max_tokens",
+ "model_kwargs",
+ "model_name",
+ "openai_api_base",
+ "openai_api_key",
+ "temperature",
+ "input_value",
+ "system_message",
+ "stream"
+ ],
+ "beta": false
+ },
+ "id": "OpenAIModel-XawYB"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 565,
+ "positionAbsolute": {
+ "x": 4500.152018344182,
+ "y": 1027.7382026227656
+ },
+ "dragging": false
+ }
+ ],
+ "edges": [
+ {
+ "source": "TextInput-sptaH",
+ "sourceHandle": "{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œTextInputœ,œidœ:œTextInput-sptaHœ}",
+ "target": "Prompt-amqBu",
+ "targetHandle": "{œfieldNameœ:œdocumentœ,œidœ:œPrompt-amqBuœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "document",
+ "id": "Prompt-amqBu",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["str", "Text", "object"],
+ "dataType": "TextInput",
+ "id": "TextInput-sptaH"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-TextInput-sptaH{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œTextInputœ,œidœ:œTextInput-sptaHœ}-Prompt-amqBu{œfieldNameœ:œdocumentœ,œidœ:œPrompt-amqBuœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "Prompt-amqBu",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-amqBuœ}",
+ "target": "TextOutput-2MS4a",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œTextOutput-2MS4aœ,œinputTypesœ:[œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "TextOutput-2MS4a",
+ "inputTypes": ["Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "str", "Text"],
+ "dataType": "Prompt",
+ "id": "Prompt-amqBu"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-Prompt-amqBu{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-amqBuœ}-TextOutput-2MS4a{œfieldNameœ:œinput_valueœ,œidœ:œTextOutput-2MS4aœ,œinputTypesœ:[œRecordœ,œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "Prompt-amqBu",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-amqBuœ}",
+ "target": "OpenAIModel-uYXZJ",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-uYXZJœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "OpenAIModel-uYXZJ",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "str", "Text"],
+ "dataType": "Prompt",
+ "id": "Prompt-amqBu"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-Prompt-amqBu{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-amqBuœ}-OpenAIModel-uYXZJ{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-uYXZJœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "OpenAIModel-uYXZJ",
+ "sourceHandle": "{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-uYXZJœ}",
+ "target": "Prompt-gTNiz",
+ "targetHandle": "{œfieldNameœ:œsummaryœ,œidœ:œPrompt-gTNizœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "summary",
+ "id": "Prompt-gTNiz",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["str", "Text", "object"],
+ "dataType": "OpenAIModel",
+ "id": "OpenAIModel-uYXZJ"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-OpenAIModel-uYXZJ{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-uYXZJœ}-Prompt-gTNiz{œfieldNameœ:œsummaryœ,œidœ:œPrompt-gTNizœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "OpenAIModel-uYXZJ",
+ "sourceHandle": "{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-uYXZJœ}",
+ "target": "ChatOutput-EJkG3",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-EJkG3œ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "ChatOutput-EJkG3",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["str", "Text", "object"],
+ "dataType": "OpenAIModel",
+ "id": "OpenAIModel-uYXZJ"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-OpenAIModel-uYXZJ{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-uYXZJœ}-ChatOutput-EJkG3{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-EJkG3œ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "Prompt-gTNiz",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-gTNizœ}",
+ "target": "TextOutput-MUDOR",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œTextOutput-MUDORœ,œinputTypesœ:[œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "TextOutput-MUDOR",
+ "inputTypes": ["Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "str", "Text"],
+ "dataType": "Prompt",
+ "id": "Prompt-gTNiz"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-Prompt-gTNiz{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-gTNizœ}-TextOutput-MUDOR{œfieldNameœ:œinput_valueœ,œidœ:œTextOutput-MUDORœ,œinputTypesœ:[œRecordœ,œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "Prompt-gTNiz",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-gTNizœ}",
+ "target": "OpenAIModel-XawYB",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-XawYBœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "OpenAIModel-XawYB",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "str", "Text"],
+ "dataType": "Prompt",
+ "id": "Prompt-gTNiz"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-Prompt-gTNiz{œbaseClassesœ:[œobjectœ,œstrœ,œTextœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-gTNizœ}-OpenAIModel-XawYB{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-XawYBœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "OpenAIModel-XawYB",
+ "sourceHandle": "{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-XawYBœ}",
+ "target": "ChatOutput-DNmvg",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-DNmvgœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "ChatOutput-DNmvg",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["str", "Text", "object"],
+ "dataType": "OpenAIModel",
+ "id": "OpenAIModel-XawYB"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-OpenAIModel-XawYB{œbaseClassesœ:[œstrœ,œTextœ,œobjectœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-XawYBœ}-ChatOutput-DNmvg{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-DNmvgœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ }
+ ],
+ "viewport": {
+ "x": -383.7251879618552,
+ "y": 69.19813933800037,
+ "zoom": 0.3105753483695743
+ }
+ },
+ "description": "The Prompt Chaining flow chains prompts with LLMs, refining outputs through iterative stages.",
+ "name": "Prompt Chaining",
+ "last_tested_version": "1.0.0a0",
+ "is_component": false
}
diff --git a/src/backend/base/langflow/initial_setup/starter_projects/VectorStore-RAG-Flows.json b/src/backend/base/langflow/initial_setup/starter_projects/VectorStore-RAG-Flows.json
index 489d1cc19..bbf59c8b1 100644
--- a/src/backend/base/langflow/initial_setup/starter_projects/VectorStore-RAG-Flows.json
+++ b/src/backend/base/langflow/initial_setup/starter_projects/VectorStore-RAG-Flows.json
@@ -1,3407 +1,3151 @@
{
- "id": "51e2b78a-199b-4054-9f32-e288eef6924c",
- "data": {
- "nodes": [
- {
- "id": "ChatInput-yxMKE",
- "type": "genericNode",
- "position": {
- "x": 1195.5276981160775,
- "y": 209.421875
- },
- "data": {
- "type": "ChatInput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n\n def build_config(self):\n build_config = super().build_config()\n build_config[\"input_value\"] = {\n \"input_types\": [],\n \"display_name\": \"Message\",\n \"multiline\": True,\n }\n\n return build_config\n\n def build(\n self,\n sender: Optional[str] = \"User\",\n sender_name: Optional[str] = \"User\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n ) -> Union[Text, Record]:\n return super().build_no_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "value": "what is a line"
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "User",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "User",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": true,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Get chat inputs from the Playground.",
- "icon": "ChatInput",
- "base_classes": [
- "Text",
- "str",
- "object",
- "Record"
- ],
- "display_name": "Chat Input",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatInput-yxMKE"
- },
- "selected": false,
- "width": 384,
- "height": 383
+ "id": "51e2b78a-199b-4054-9f32-e288eef6924c",
+ "data": {
+ "nodes": [
+ {
+ "id": "ChatInput-yxMKE",
+ "type": "genericNode",
+ "position": {
+ "x": 1195.5276981160775,
+ "y": 209.421875
+ },
+ "data": {
+ "type": "ChatInput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n\n def build_config(self):\n build_config = super().build_config()\n build_config[\"input_value\"] = {\n \"input_types\": [],\n \"display_name\": \"Message\",\n \"multiline\": True,\n }\n\n return build_config\n\n def build(\n self,\n sender: Optional[str] = \"User\",\n sender_name: Optional[str] = \"User\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n ) -> Union[Text, Record]:\n return super().build_no_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": [],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "value": "what is a line"
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "User",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "User",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "TextOutput-BDknO",
- "type": "genericNode",
- "position": {
- "x": 2322.600672827879,
- "y": 604.9467307442569
- },
- "data": {
- "type": "TextOutput",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Value",
- "advanced": false,
- "input_types": [
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "Text or Record to be passed as output.",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextOutput(TextComponent):\n display_name = \"Text Output\"\n description = \"Display a text output in the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as output.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(self, input_value: Optional[Text] = \"\", record_template: str = \"\") -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "{text}",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a text output in the Playground.",
- "icon": "type",
- "base_classes": [
- "object",
- "Text",
- "str"
- ],
- "display_name": "Extracted Chunks",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "record_template": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "TextOutput-BDknO"
- },
- "selected": false,
- "width": 384,
- "height": 289,
- "positionAbsolute": {
- "x": 2322.600672827879,
- "y": 604.9467307442569
- },
- "dragging": false
+ "description": "Get chat inputs from the Playground.",
+ "icon": "ChatInput",
+ "base_classes": ["Text", "str", "object", "Record"],
+ "display_name": "Chat Input",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null
},
- {
- "id": "OpenAIEmbeddings-ZlOk1",
- "type": "genericNode",
- "position": {
- "x": 1183.667250865064,
- "y": 687.3171828430261
- },
- "data": {
- "type": "OpenAIEmbeddings",
- "node": {
- "template": {
- "allowed_special": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": [],
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "allowed_special",
- "display_name": "Allowed Special",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "chunk_size": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 1000,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "chunk_size",
- "display_name": "Chunk Size",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "client": {
- "type": "Any",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "client",
- "display_name": "Client",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
-              "value": "from typing import Dict, List, Optional\n\nfrom langchain_openai.embeddings.base import OpenAIEmbeddings\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Embeddings, NestedDict\n\n\nclass OpenAIEmbeddingsComponent(CustomComponent):\n    display_name = \"OpenAI Embeddings\"\n    description = \"Generate embeddings using OpenAI models.\"\n\n    def build_config(self):\n        return {\n            \"allowed_special\": {\n                \"display_name\": \"Allowed Special\",\n                \"advanced\": True,\n                \"field_type\": \"str\",\n                \"is_list\": True,\n            },\n            \"default_headers\": {\n                \"display_name\": \"Default Headers\",\n                \"advanced\": True,\n                \"field_type\": \"dict\",\n            },\n            \"default_query\": {\n                \"display_name\": \"Default Query\",\n                \"advanced\": True,\n                \"field_type\": \"NestedDict\",\n            },\n            \"disallowed_special\": {\n                \"display_name\": \"Disallowed Special\",\n                \"advanced\": True,\n                \"field_type\": \"str\",\n                \"is_list\": True,\n            },\n            \"chunk_size\": {\"display_name\": \"Chunk Size\", \"advanced\": True},\n            \"client\": {\"display_name\": \"Client\", \"advanced\": True},\n            \"deployment\": {\"display_name\": \"Deployment\", \"advanced\": True},\n            \"embedding_ctx_length\": {\n                \"display_name\": \"Embedding Context Length\",\n                \"advanced\": True,\n            },\n            \"max_retries\": {\"display_name\": \"Max Retries\", \"advanced\": True},\n            \"model\": {\n                \"display_name\": \"Model\",\n                \"advanced\": False,\n                \"options\": [\n                    \"text-embedding-3-small\",\n                    \"text-embedding-3-large\",\n                    \"text-embedding-ada-002\",\n                ],\n            },\n            \"model_kwargs\": {\"display_name\": \"Model Kwargs\", \"advanced\": True},\n            \"openai_api_base\": {\n                \"display_name\": \"OpenAI API Base\",\n                \"password\": True,\n                \"advanced\": True,\n            },\n            \"openai_api_key\": {\"display_name\": \"OpenAI API Key\", \"password\": True},\n            \"openai_api_type\": {\n                \"display_name\": \"OpenAI API Type\",\n                \"advanced\": True,\n                \"password\": True,\n            },\n            \"openai_api_version\": {\n                \"display_name\": \"OpenAI API Version\",\n                \"advanced\": True,\n            },\n            \"openai_organization\": {\n                \"display_name\": \"OpenAI Organization\",\n                \"advanced\": True,\n            },\n            \"openai_proxy\": {\"display_name\": \"OpenAI Proxy\", \"advanced\": True},\n            \"request_timeout\": {\"display_name\": \"Request Timeout\", \"advanced\": True},\n            \"show_progress_bar\": {\n                \"display_name\": \"Show Progress Bar\",\n                \"advanced\": True,\n            },\n            \"skip_empty\": {\"display_name\": \"Skip Empty\", \"advanced\": True},\n            \"tiktoken_model_name\": {\n                \"display_name\": \"TikToken Model Name\",\n                \"advanced\": True,\n            },\n            \"tiktoken_enable\": {\"display_name\": \"TikToken Enable\", \"advanced\": True},\n        }\n\n    def build(\n        self,\n        openai_api_key: str,\n        default_headers: Optional[Dict[str, str]] = None,\n        default_query: Optional[NestedDict] = {},\n        allowed_special: List[str] = [],\n        disallowed_special: List[str] = [\"all\"],\n        chunk_size: int = 1000,\n        deployment: str = \"text-embedding-ada-002\",\n        embedding_ctx_length: int = 8191,\n        max_retries: int = 6,\n        model: str = \"text-embedding-ada-002\",\n        model_kwargs: NestedDict = {},\n        openai_api_base: Optional[str] = None,\n        openai_api_type: Optional[str] = None,\n        openai_api_version: Optional[str] = None,\n        openai_organization: Optional[str] = None,\n        openai_proxy: Optional[str] = None,\n        request_timeout: Optional[float] = None,\n        show_progress_bar: bool = False,\n        skip_empty: bool = False,\n        tiktoken_enable: bool = True,\n        tiktoken_model_name: Optional[str] = None,\n    ) -> Embeddings:\n        # This is to avoid errors with Vector Stores (e.g Chroma)\n        if disallowed_special == [\"all\"]:\n            disallowed_special = \"all\"  # type: ignore\n        if openai_api_key:\n            api_key = SecretStr(openai_api_key)\n        else:\n            api_key = None\n\n        return OpenAIEmbeddings(\n            tiktoken_enabled=tiktoken_enable,\n            default_headers=default_headers,\n            default_query=default_query,\n            allowed_special=set(allowed_special),\n            disallowed_special=\"all\",\n            chunk_size=chunk_size,\n            deployment=deployment,\n            embedding_ctx_length=embedding_ctx_length,\n            max_retries=max_retries,\n            model=model,\n            model_kwargs=model_kwargs,\n            base_url=openai_api_base,\n            api_key=api_key,\n            openai_api_type=openai_api_type,\n            api_version=openai_api_version,\n            organization=openai_organization,\n            openai_proxy=openai_proxy,\n            timeout=request_timeout,\n            show_progress_bar=show_progress_bar,\n            skip_empty=skip_empty,\n            tiktoken_model_name=tiktoken_model_name,\n        )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "default_headers": {
- "type": "dict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "default_headers",
- "display_name": "Default Headers",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "default_query": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "default_query",
- "display_name": "Default Query",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "deployment": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "text-embedding-ada-002",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "deployment",
- "display_name": "Deployment",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "disallowed_special": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": [
- "all"
- ],
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "disallowed_special",
- "display_name": "Disallowed Special",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "embedding_ctx_length": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 8191,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "embedding_ctx_length",
- "display_name": "Embedding Context Length",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_retries": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 6,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_retries",
- "display_name": "Max Retries",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "text-embedding-ada-002",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "text-embedding-3-small",
- "text-embedding-3-large",
- "text-embedding-ada-002"
- ],
- "name": "model",
- "display_name": "Model",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "OPENAI_API_KEY"
- },
- "openai_api_type": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_type",
- "display_name": "OpenAI API Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_version": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_version",
- "display_name": "OpenAI API Version",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_organization": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_organization",
- "display_name": "OpenAI Organization",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_proxy": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_proxy",
- "display_name": "OpenAI Proxy",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "request_timeout": {
- "type": "float",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "request_timeout",
- "display_name": "Request Timeout",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "show_progress_bar": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "show_progress_bar",
- "display_name": "Show Progress Bar",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "skip_empty": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "skip_empty",
- "display_name": "Skip Empty",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "tiktoken_enable": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "tiktoken_enable",
- "display_name": "TikToken Enable",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "tiktoken_model_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "tiktoken_model_name",
- "display_name": "TikToken Model Name",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Generate embeddings using OpenAI models.",
- "base_classes": [
- "Embeddings"
- ],
- "display_name": "OpenAI Embeddings",
- "documentation": "",
- "custom_fields": {
- "openai_api_key": null,
- "default_headers": null,
- "default_query": null,
- "allowed_special": null,
- "disallowed_special": null,
- "chunk_size": null,
- "client": null,
- "deployment": null,
- "embedding_ctx_length": null,
- "max_retries": null,
- "model": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "openai_api_type": null,
- "openai_api_version": null,
- "openai_organization": null,
- "openai_proxy": null,
- "request_timeout": null,
- "show_progress_bar": null,
- "skip_empty": null,
- "tiktoken_enable": null,
- "tiktoken_model_name": null
- },
- "output_types": [
- "Embeddings"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "OpenAIEmbeddings-ZlOk1"
- },
- "selected": false,
- "width": 384,
- "height": 383,
- "dragging": false
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatInput-yxMKE"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 383
+ },
+ {
+ "id": "TextOutput-BDknO",
+ "type": "genericNode",
+ "position": {
+ "x": 2322.600672827879,
+ "y": 604.9467307442569
+ },
+ "data": {
+ "type": "TextOutput",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Value",
+ "advanced": false,
+ "input_types": ["Record", "Text"],
+ "dynamic": false,
+ "info": "Text or Record to be passed as output.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langflow.base.io.text import TextComponent\nfrom langflow.field_typing import Text\n\n\nclass TextOutput(TextComponent):\n display_name = \"Text Output\"\n description = \"Display a text output in the Playground.\"\n icon = \"type\"\n\n def build_config(self):\n return {\n \"input_value\": {\n \"display_name\": \"Value\",\n \"input_types\": [\"Record\", \"Text\"],\n \"info\": \"Text or Record to be passed as output.\",\n },\n \"record_template\": {\n \"display_name\": \"Record Template\",\n \"multiline\": True,\n \"info\": \"Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.\",\n \"advanced\": True,\n },\n }\n\n def build(self, input_value: Optional[Text] = \"\", record_template: str = \"\") -> Text:\n return super().build(input_value=input_value, record_template=record_template)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "{text}",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Template to convert Record to Text. If left empty, it will be dynamically set to the Record's text key.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "OpenAIModel-EjXlN",
- "type": "genericNode",
- "position": {
- "x": 3410.117202077183,
- "y": 431.2038048137648
- },
- "data": {
- "type": "OpenAIModel",
- "node": {
- "template": {
- "input_value": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Input",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
-              "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n    display_name = \"OpenAI\"\n    description = \"Generates text using OpenAI LLMs.\"\n    icon = \"OpenAI\"\n\n    field_order = [\n        \"max_tokens\",\n        \"model_kwargs\",\n        \"model_name\",\n        \"openai_api_base\",\n        \"openai_api_key\",\n        \"temperature\",\n        \"input_value\",\n        \"system_message\",\n        \"stream\",\n    ]\n\n    def build_config(self):\n        return {\n            \"input_value\": {\"display_name\": \"Input\"},\n            \"max_tokens\": {\n                \"display_name\": \"Max Tokens\",\n                \"advanced\": True,\n                \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n            },\n            \"model_kwargs\": {\n                \"display_name\": \"Model Kwargs\",\n                \"advanced\": True,\n            },\n            \"model_name\": {\n                \"display_name\": \"Model Name\",\n                \"advanced\": False,\n                \"options\": MODEL_NAMES,\n            },\n            \"openai_api_base\": {\n                \"display_name\": \"OpenAI API Base\",\n                \"advanced\": True,\n                \"info\": (\n                    \"The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\\n\\n\"\n                    \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n                ),\n            },\n            \"openai_api_key\": {\n                \"display_name\": \"OpenAI API Key\",\n                \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n                \"advanced\": False,\n                \"password\": True,\n            },\n            \"temperature\": {\n                \"display_name\": \"Temperature\",\n                \"advanced\": False,\n                \"value\": 0.1,\n            },\n            \"stream\": {\n                \"display_name\": \"Stream\",\n                \"info\": STREAM_INFO_TEXT,\n                \"advanced\": True,\n            },\n            \"system_message\": {\n                \"display_name\": \"System Message\",\n                \"info\": \"System message to pass to the model.\",\n                \"advanced\": True,\n            },\n        }\n\n    def build(\n        self,\n        input_value: Text,\n        openai_api_key: str,\n        temperature: float,\n        model_name: str = \"gpt-4o\",\n        max_tokens: Optional[int] = 256,\n        model_kwargs: NestedDict = {},\n        openai_api_base: Optional[str] = None,\n        stream: bool = False,\n        system_message: Optional[str] = None,\n    ) -> Text:\n        if not openai_api_base:\n            openai_api_base = \"https://api.openai.com/v1\"\n        if openai_api_key:\n            api_key = SecretStr(openai_api_key)\n        else:\n            api_key = None\n\n        output = ChatOpenAI(\n            max_tokens=max_tokens or None,\n            model_kwargs=model_kwargs,\n            model=model_name,\n            base_url=openai_api_base,\n            api_key=api_key,\n            temperature=temperature,\n        )\n\n        return self.get_chat_result(output, stream, input_value, system_message)\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_tokens": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 256,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_tokens",
- "display_name": "Max Tokens",
- "advanced": true,
- "dynamic": false,
- "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
- "load_from_db": false,
- "title_case": false
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "gpt-3.5-turbo",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "gpt-4o",
- "gpt-4-turbo",
- "gpt-4-turbo-preview",
- "gpt-3.5-turbo",
- "gpt-3.5-turbo-0125"
- ],
- "name": "model_name",
- "display_name": "Model Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "The OpenAI API Key to use for the OpenAI model.",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "OPENAI_API_KEY"
- },
- "stream": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "stream",
- "display_name": "Stream",
- "advanced": true,
- "dynamic": false,
- "info": "Stream the response from the model. Streaming works only in Chat.",
- "load_from_db": false,
- "title_case": false
- },
- "system_message": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "system_message",
- "display_name": "System Message",
- "advanced": true,
- "dynamic": false,
- "info": "System message to pass to the model.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "temperature": {
- "type": "float",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 0.1,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "temperature",
- "display_name": "Temperature",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent"
- },
- "description": "Generates text using OpenAI LLMs.",
- "icon": "OpenAI",
- "base_classes": [
- "object",
- "Text",
- "str"
- ],
- "display_name": "OpenAI",
- "documentation": "",
- "custom_fields": {
- "input_value": null,
- "openai_api_key": null,
- "temperature": null,
- "model_name": null,
- "max_tokens": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "stream": null,
- "system_message": null
- },
- "output_types": [
- "Text"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "max_tokens",
- "model_kwargs",
- "model_name",
- "openai_api_base",
- "openai_api_key",
- "temperature",
- "input_value",
- "system_message",
- "stream"
- ],
- "beta": false
- },
- "id": "OpenAIModel-EjXlN"
- },
- "selected": true,
- "width": 384,
- "height": 563,
- "positionAbsolute": {
- "x": 3410.117202077183,
- "y": 431.2038048137648
- },
- "dragging": false
+ "description": "Display a text output in the Playground.",
+ "icon": "type",
+ "base_classes": ["object", "Text", "str"],
+ "display_name": "Extracted Chunks",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "record_template": null
},
- {
- "id": "Prompt-xeI6K",
- "type": "genericNode",
- "position": {
- "x": 2969.0261961391298,
- "y": 442.1613649809069
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "TextOutput-BDknO"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 289,
+ "positionAbsolute": {
+ "x": 2322.600672827879,
+ "y": 604.9467307442569
+ },
+ "dragging": false
+ },
+ {
+ "id": "OpenAIEmbeddings-ZlOk1",
+ "type": "genericNode",
+ "position": {
+ "x": 1183.667250865064,
+ "y": 687.3171828430261
+ },
+ "data": {
+ "type": "OpenAIEmbeddings",
+ "node": {
+ "template": {
+ "allowed_special": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": [],
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "allowed_special",
+ "display_name": "Allowed Special",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "chunk_size": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 1000,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "chunk_size",
+ "display_name": "Chunk Size",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "client": {
+ "type": "Any",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "client",
+ "display_name": "Client",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+              "value": "from typing import Dict, List, Optional\n\nfrom langchain_openai.embeddings.base import OpenAIEmbeddings\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Embeddings, NestedDict\n\n\nclass OpenAIEmbeddingsComponent(CustomComponent):\n    display_name = \"OpenAI Embeddings\"\n    description = \"Generate embeddings using OpenAI models.\"\n\n    def build_config(self):\n        return {\n            \"allowed_special\": {\n                \"display_name\": \"Allowed Special\",\n                \"advanced\": True,\n                \"field_type\": \"str\",\n                \"is_list\": True,\n            },\n            \"default_headers\": {\n                \"display_name\": \"Default Headers\",\n                \"advanced\": True,\n                \"field_type\": \"dict\",\n            },\n            \"default_query\": {\n                \"display_name\": \"Default Query\",\n                \"advanced\": True,\n                \"field_type\": \"NestedDict\",\n            },\n            \"disallowed_special\": {\n                \"display_name\": \"Disallowed Special\",\n                \"advanced\": True,\n                \"field_type\": \"str\",\n                \"is_list\": True,\n            },\n            \"chunk_size\": {\"display_name\": \"Chunk Size\", \"advanced\": True},\n            \"client\": {\"display_name\": \"Client\", \"advanced\": True},\n            \"deployment\": {\"display_name\": \"Deployment\", \"advanced\": True},\n            \"embedding_ctx_length\": {\n                \"display_name\": \"Embedding Context Length\",\n                \"advanced\": True,\n            },\n            \"max_retries\": {\"display_name\": \"Max Retries\", \"advanced\": True},\n            \"model\": {\n                \"display_name\": \"Model\",\n                \"advanced\": False,\n                \"options\": [\n                    \"text-embedding-3-small\",\n                    \"text-embedding-3-large\",\n                    \"text-embedding-ada-002\",\n                ],\n            },\n            \"model_kwargs\": {\"display_name\": \"Model Kwargs\", \"advanced\": True},\n            \"openai_api_base\": {\n                \"display_name\": \"OpenAI API Base\",\n                \"password\": True,\n                \"advanced\": True,\n            },\n            \"openai_api_key\": {\"display_name\": \"OpenAI API Key\", \"password\": True},\n            \"openai_api_type\": {\n                \"display_name\": \"OpenAI API Type\",\n                \"advanced\": True,\n                \"password\": True,\n            },\n            \"openai_api_version\": {\n                \"display_name\": \"OpenAI API Version\",\n                \"advanced\": True,\n            },\n            \"openai_organization\": {\n                \"display_name\": \"OpenAI Organization\",\n                \"advanced\": True,\n            },\n            \"openai_proxy\": {\"display_name\": \"OpenAI Proxy\", \"advanced\": True},\n            \"request_timeout\": {\"display_name\": \"Request Timeout\", \"advanced\": True},\n            \"show_progress_bar\": {\n                \"display_name\": \"Show Progress Bar\",\n                \"advanced\": True,\n            },\n            \"skip_empty\": {\"display_name\": \"Skip Empty\", \"advanced\": True},\n            \"tiktoken_model_name\": {\n                \"display_name\": \"TikToken Model Name\",\n                \"advanced\": True,\n            },\n            \"tiktoken_enable\": {\"display_name\": \"TikToken Enable\", \"advanced\": True},\n        }\n\n    def build(\n        self,\n        openai_api_key: str,\n        default_headers: Optional[Dict[str, str]] = None,\n        default_query: Optional[NestedDict] = {},\n        allowed_special: List[str] = [],\n        disallowed_special: List[str] = [\"all\"],\n        chunk_size: int = 1000,\n        deployment: str = \"text-embedding-ada-002\",\n        embedding_ctx_length: int = 8191,\n        max_retries: int = 6,\n        model: str = \"text-embedding-ada-002\",\n        model_kwargs: NestedDict = {},\n        openai_api_base: Optional[str] = None,\n        openai_api_type: Optional[str] = None,\n        openai_api_version: Optional[str] = None,\n        openai_organization: Optional[str] = None,\n        openai_proxy: Optional[str] = None,\n        request_timeout: Optional[float] = None,\n        show_progress_bar: bool = False,\n        skip_empty: bool = False,\n        tiktoken_enable: bool = True,\n        tiktoken_model_name: Optional[str] = None,\n    ) -> Embeddings:\n        # This is to avoid errors with Vector Stores (e.g Chroma)\n        if disallowed_special == [\"all\"]:\n            disallowed_special = \"all\"  # type: ignore\n        if openai_api_key:\n            api_key = SecretStr(openai_api_key)\n        else:\n            api_key = None\n\n        return OpenAIEmbeddings(\n            tiktoken_enabled=tiktoken_enable,\n            default_headers=default_headers,\n            default_query=default_query,\n            allowed_special=set(allowed_special),\n            disallowed_special=\"all\",\n            chunk_size=chunk_size,\n            deployment=deployment,\n            embedding_ctx_length=embedding_ctx_length,\n            max_retries=max_retries,\n            model=model,\n            model_kwargs=model_kwargs,\n            base_url=openai_api_base,\n            api_key=api_key,\n            openai_api_type=openai_api_type,\n            api_version=openai_api_version,\n            organization=openai_organization,\n            openai_proxy=openai_proxy,\n            timeout=request_timeout,\n            show_progress_bar=show_progress_bar,\n            skip_empty=skip_empty,\n            tiktoken_model_name=tiktoken_model_name,\n        )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "default_headers": {
+ "type": "dict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "default_headers",
+ "display_name": "Default Headers",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "default_query": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "default_query",
+ "display_name": "Default Query",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "deployment": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "text-embedding-ada-002",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "deployment",
+ "display_name": "Deployment",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "disallowed_special": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": ["all"],
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "disallowed_special",
+ "display_name": "Disallowed Special",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "embedding_ctx_length": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 8191,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "embedding_ctx_length",
+ "display_name": "Embedding Context Length",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_retries": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 6,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_retries",
+ "display_name": "Max Retries",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "text-embedding-ada-002",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "text-embedding-3-small",
+ "text-embedding-3-large",
+ "text-embedding-ada-002"
+ ],
+ "name": "model",
+ "display_name": "Model",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "OPENAI_API_KEY"
+ },
+ "openai_api_type": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_type",
+ "display_name": "OpenAI API Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_version": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_version",
+ "display_name": "OpenAI API Version",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_organization": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_organization",
+ "display_name": "OpenAI Organization",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_proxy": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_proxy",
+ "display_name": "OpenAI Proxy",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "request_timeout": {
+ "type": "float",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "request_timeout",
+ "display_name": "Request Timeout",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
},
- "data": {
- "type": "Prompt",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "template": {
- "type": "prompt",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "{context}\n\n---\n\nGiven the context above, answer the question as best as possible.\n\nQuestion: {question}\n\nAnswer: ",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "template",
- "display_name": "Template",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent",
- "context": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "context",
- "display_name": "context",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- },
- "question": {
- "field_type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "question",
- "display_name": "question",
- "advanced": false,
- "input_types": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "type": "str"
- }
- },
- "description": "Create a prompt template with dynamic variables.",
- "icon": "prompts",
- "is_input": null,
- "is_output": null,
- "is_composition": null,
- "base_classes": [
- "object",
- "Text",
- "str"
- ],
- "name": "",
- "display_name": "Prompt",
- "documentation": "",
- "custom_fields": {
- "template": [
- "context",
- "question"
- ]
- },
- "output_types": [
- "Text"
- ],
- "full_path": null,
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false,
- "error": null
- },
- "id": "Prompt-xeI6K",
- "description": "Create a prompt template with dynamic variables.",
- "display_name": "Prompt"
- },
- "selected": false,
- "width": 384,
- "height": 477,
- "positionAbsolute": {
- "x": 2969.0261961391298,
- "y": 442.1613649809069
- },
- "dragging": false
+ "load_from_db": false,
+ "title_case": false
+ },
+ "show_progress_bar": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "show_progress_bar",
+ "display_name": "Show Progress Bar",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "skip_empty": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "skip_empty",
+ "display_name": "Skip Empty",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "tiktoken_enable": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "tiktoken_enable",
+ "display_name": "TikToken Enable",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "tiktoken_model_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "tiktoken_model_name",
+ "display_name": "TikToken Model Name",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "ChatOutput-Q39I8",
- "type": "genericNode",
- "position": {
- "x": 3887.2073667611485,
- "y": 588.4801225794856
- },
- "data": {
- "type": "ChatOutput",
- "node": {
- "template": {
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Message",
- "advanced": false,
- "input_types": [
- "Text"
- ],
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "record_template": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "{text}",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "record_template",
- "display_name": "Record Template",
- "advanced": true,
- "dynamic": false,
- "info": "In case of Message being a Record, this template will be used to convert it to text.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "return_record": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "return_record",
- "display_name": "Return Record",
- "advanced": true,
- "dynamic": false,
- "info": "Return the message as a record containing the sender, sender_name, and session_id.",
- "load_from_db": false,
- "title_case": false
- },
- "sender": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Machine",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Machine",
- "User"
- ],
- "name": "sender",
- "display_name": "Sender Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "sender_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "AI",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "sender_name",
- "display_name": "Sender Name",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "session_id": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "session_id",
- "display_name": "Session ID",
- "advanced": true,
- "dynamic": false,
- "info": "If provided, the message will be stored in the memory.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Display a chat message in the Playground.",
- "icon": "ChatOutput",
- "base_classes": [
- "object",
- "Text",
- "Record",
- "str"
- ],
- "display_name": "Chat Output",
- "documentation": "",
- "custom_fields": {
- "sender": null,
- "sender_name": null,
- "input_value": null,
- "session_id": null,
- "return_record": null,
- "record_template": null
- },
- "output_types": [
- "Text",
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "ChatOutput-Q39I8"
- },
- "selected": false,
- "width": 384,
- "height": 383,
- "positionAbsolute": {
- "x": 3887.2073667611485,
- "y": 588.4801225794856
- },
- "dragging": false
+ "description": "Generate embeddings using OpenAI models.",
+ "base_classes": ["Embeddings"],
+ "display_name": "OpenAI Embeddings",
+ "documentation": "",
+ "custom_fields": {
+ "openai_api_key": null,
+ "default_headers": null,
+ "default_query": null,
+ "allowed_special": null,
+ "disallowed_special": null,
+ "chunk_size": null,
+ "client": null,
+ "deployment": null,
+ "embedding_ctx_length": null,
+ "max_retries": null,
+ "model": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "openai_api_type": null,
+ "openai_api_version": null,
+ "openai_organization": null,
+ "openai_proxy": null,
+ "request_timeout": null,
+ "show_progress_bar": null,
+ "skip_empty": null,
+ "tiktoken_enable": null,
+ "tiktoken_model_name": null
},
- {
- "id": "File-t0a6a",
- "type": "genericNode",
- "position": {
- "x": 2257.233450682836,
- "y": 1747.5389618367233
+ "output_types": ["Embeddings"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "OpenAIEmbeddings-ZlOk1"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 383,
+ "dragging": false
+ },
+ {
+ "id": "OpenAIModel-EjXlN",
+ "type": "genericNode",
+ "position": {
+ "x": 3410.117202077183,
+ "y": 431.2038048137648
+ },
+ "data": {
+ "type": "OpenAIModel",
+ "node": {
+ "template": {
+ "input_value": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Input",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.constants import STREAM_INFO_TEXT\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import MODEL_NAMES\nfrom langflow.field_typing import NestedDict, Text\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n\n field_order = [\n \"max_tokens\",\n \"model_kwargs\",\n \"model_name\",\n \"openai_api_base\",\n \"openai_api_key\",\n \"temperature\",\n \"input_value\",\n \"system_message\",\n \"stream\",\n ]\n\n def build_config(self):\n return {\n \"input_value\": {\"display_name\": \"Input\"},\n \"max_tokens\": {\n \"display_name\": \"Max Tokens\",\n \"advanced\": True,\n \"info\": \"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n },\n \"model_kwargs\": {\n \"display_name\": \"Model Kwargs\",\n \"advanced\": True,\n },\n \"model_name\": {\n \"display_name\": \"Model Name\",\n \"advanced\": False,\n \"options\": MODEL_NAMES,\n },\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"advanced\": True,\n \"info\": (\n \"The base URL of the OpenAI API. 
Defaults to https://api.openai.com/v1.\\n\\n\"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\"\n ),\n },\n \"openai_api_key\": {\n \"display_name\": \"OpenAI API Key\",\n \"info\": \"The OpenAI API Key to use for the OpenAI model.\",\n \"advanced\": False,\n \"password\": True,\n },\n \"temperature\": {\n \"display_name\": \"Temperature\",\n \"advanced\": False,\n \"value\": 0.1,\n },\n \"stream\": {\n \"display_name\": \"Stream\",\n \"info\": STREAM_INFO_TEXT,\n \"advanced\": True,\n },\n \"system_message\": {\n \"display_name\": \"System Message\",\n \"info\": \"System message to pass to the model.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n input_value: Text,\n openai_api_key: str,\n temperature: float = 0.1,\n model_name: str = \"gpt-4o\",\n max_tokens: Optional[int] = 256,\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n stream: bool = False,\n system_message: Optional[str] = None,\n ) -> Text:\n if not openai_api_base:\n openai_api_base = \"https://api.openai.com/v1\"\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n output = ChatOpenAI(\n max_tokens=max_tokens or None,\n model_kwargs=model_kwargs,\n model=model_name,\n base_url=openai_api_base,\n api_key=api_key,\n temperature=temperature,\n )\n\n return self.get_chat_result(output, stream, input_value, system_message)\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_tokens": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 256,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_tokens",
+ "display_name": "Max Tokens",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "gpt-3.5-turbo",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "gpt-4o",
+ "gpt-4-turbo",
+ "gpt-4-turbo-preview",
+ "gpt-3.5-turbo",
+ "gpt-3.5-turbo-0125"
+ ],
+ "name": "model_name",
+ "display_name": "Model Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "The base URL of the OpenAI API. Defaults to https://api.openai.com/v1.\n\nYou can change this to use other APIs like JinaChat, LocalAI and Prem.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The OpenAI API Key to use for the OpenAI model.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "OPENAI_API_KEY"
+ },
+ "stream": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "stream",
+ "display_name": "Stream",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Stream the response from the model. Streaming works only in Chat.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "system_message": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "system_message",
+ "display_name": "System Message",
+ "advanced": true,
+ "dynamic": false,
+ "info": "System message to pass to the model.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "temperature": {
+ "type": "float",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 0.1,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "temperature",
+ "display_name": "Temperature",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
},
- "data": {
- "type": "File",
- "node": {
- "template": {
- "path": {
- "type": "file",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [
- ".txt",
- ".md",
- ".mdx",
- ".csv",
- ".json",
- ".yaml",
- ".yml",
- ".xml",
- ".html",
- ".htm",
- ".pdf",
- ".docx",
- ".py",
- ".sh",
- ".sql",
- ".js",
- ".ts",
- ".tsx"
- ],
- "file_path": "51e2b78a-199b-4054-9f32-e288eef6924c/Langflow conversation.pdf",
- "password": false,
- "name": "path",
- "display_name": "Path",
- "advanced": false,
- "dynamic": false,
- "info": "Supported file types: txt, md, mdx, csv, json, yaml, yml, xml, html, htm, pdf, docx, py, sh, sql, js, ts, tsx",
- "load_from_db": false,
- "title_case": false,
- "value": ""
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from pathlib import Path\nfrom typing import Any, Dict\n\nfrom langflow.base.data.utils import TEXT_FILE_TYPES, parse_text_file_to_record\nfrom langflow.custom import CustomComponent\nfrom langflow.schema import Record\n\n\nclass FileComponent(CustomComponent):\n display_name = \"File\"\n description = \"A generic file loader.\"\n icon = \"file-text\"\n\n def build_config(self) -> Dict[str, Any]:\n return {\n \"path\": {\n \"display_name\": \"Path\",\n \"field_type\": \"file\",\n \"file_types\": TEXT_FILE_TYPES,\n \"info\": f\"Supported file types: {', '.join(TEXT_FILE_TYPES)}\",\n },\n \"silent_errors\": {\n \"display_name\": \"Silent Errors\",\n \"advanced\": True,\n \"info\": \"If true, errors will not raise an exception.\",\n },\n }\n\n def load_file(self, path: str, silent_errors: bool = False) -> Record:\n resolved_path = self.resolve_path(path)\n path_obj = Path(resolved_path)\n extension = path_obj.suffix[1:].lower()\n if extension == \"doc\":\n raise ValueError(\"doc files are not supported. Please save as .docx\")\n if extension not in TEXT_FILE_TYPES:\n raise ValueError(f\"Unsupported file type: {extension}\")\n record = parse_text_file_to_record(resolved_path, silent_errors)\n self.status = record if record else \"No data\"\n return record or Record()\n\n def build(\n self,\n path: str,\n silent_errors: bool = False,\n ) -> Record:\n record = self.load_file(path, silent_errors)\n self.status = record\n return record\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "silent_errors": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "silent_errors",
- "display_name": "Silent Errors",
- "advanced": true,
- "dynamic": false,
- "info": "If true, errors will not raise an exception.",
- "load_from_db": false,
- "title_case": false
- },
- "_type": "CustomComponent"
- },
- "description": "A generic file loader.",
- "icon": "file-text",
- "base_classes": [
- "Record"
- ],
- "display_name": "File",
- "documentation": "",
- "custom_fields": {
- "path": null,
- "silent_errors": null
- },
- "output_types": [
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "File-t0a6a"
- },
- "selected": false,
- "width": 384,
- "height": 281,
- "positionAbsolute": {
- "x": 2257.233450682836,
- "y": 1747.5389618367233
- },
- "dragging": false
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent"
},
- {
- "id": "RecursiveCharacterTextSplitter-tR9QM",
- "type": "genericNode",
- "position": {
- "x": 2791.013514133929,
- "y": 1462.9588953494142
- },
- "data": {
- "type": "RecursiveCharacterTextSplitter",
- "node": {
- "template": {
- "inputs": {
- "type": "Document",
- "required": true,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "inputs",
- "display_name": "Input",
- "advanced": false,
- "input_types": [
- "Document",
- "Record"
- ],
- "dynamic": false,
- "info": "The texts to split.",
- "load_from_db": false,
- "title_case": false
- },
- "chunk_overlap": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 200,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "chunk_overlap",
- "display_name": "Chunk Overlap",
- "advanced": false,
- "dynamic": false,
- "info": "The amount of overlap between chunks.",
- "load_from_db": false,
- "title_case": false
- },
- "chunk_size": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 1000,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "chunk_size",
- "display_name": "Chunk Size",
- "advanced": false,
- "dynamic": false,
- "info": "The maximum length of each chunk.",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Optional\n\nfrom langchain_core.documents import Document\nfrom langchain_text_splitters import RecursiveCharacterTextSplitter\n\nfrom langflow.custom import CustomComponent\nfrom langflow.schema import Record\nfrom langflow.utils.util import build_loader_repr_from_records, unescape_string\n\n\nclass RecursiveCharacterTextSplitterComponent(CustomComponent):\n display_name: str = \"Recursive Character Text Splitter\"\n description: str = \"Split text into chunks of a specified length.\"\n documentation: str = \"https://docs.langflow.org/components/text-splitters#recursivecharactertextsplitter\"\n\n def build_config(self):\n return {\n \"inputs\": {\n \"display_name\": \"Input\",\n \"info\": \"The texts to split.\",\n \"input_types\": [\"Document\", \"Record\"],\n },\n \"separators\": {\n \"display_name\": \"Separators\",\n \"info\": 'The characters to split on.\\nIf left empty defaults to [\"\\\\n\\\\n\", \"\\\\n\", \" \", \"\"].',\n \"is_list\": True,\n },\n \"chunk_size\": {\n \"display_name\": \"Chunk Size\",\n \"info\": \"The maximum length of each chunk.\",\n \"field_type\": \"int\",\n \"value\": 1000,\n },\n \"chunk_overlap\": {\n \"display_name\": \"Chunk Overlap\",\n \"info\": \"The amount of overlap between chunks.\",\n \"field_type\": \"int\",\n \"value\": 200,\n },\n \"code\": {\"show\": False},\n }\n\n def build(\n self,\n inputs: list[Document],\n separators: Optional[list[str]] = None,\n chunk_size: Optional[int] = 1000,\n chunk_overlap: Optional[int] = 200,\n ) -> list[Record]:\n \"\"\"\n Split text into chunks of a specified length.\n\n Args:\n separators (list[str]): The characters to split on.\n chunk_size (int): The maximum length of each chunk.\n chunk_overlap (int): The amount of overlap between chunks.\n length_function (function): The function to use to calculate the length of the text.\n\n Returns:\n list[str]: The chunks of text.\n \"\"\"\n\n if separators == \"\":\n separators = None\n elif separators:\n # 
check if the separators list has escaped characters\n # if there are escaped characters, unescape them\n separators = [unescape_string(x) for x in separators]\n\n # Make sure chunk_size and chunk_overlap are ints\n if isinstance(chunk_size, str):\n chunk_size = int(chunk_size)\n if isinstance(chunk_overlap, str):\n chunk_overlap = int(chunk_overlap)\n splitter = RecursiveCharacterTextSplitter(\n separators=separators,\n chunk_size=chunk_size,\n chunk_overlap=chunk_overlap,\n )\n documents = []\n for _input in inputs:\n if isinstance(_input, Record):\n documents.append(_input.to_lc_document())\n else:\n documents.append(_input)\n docs = splitter.split_documents(documents)\n records = self.to_records(docs)\n self.repr_value = build_loader_repr_from_records(records)\n return records\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "separators": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "separators",
- "display_name": "Separators",
- "advanced": false,
- "dynamic": false,
- "info": "The characters to split on.\nIf left empty defaults to [\"\\n\\n\", \"\\n\", \" \", \"\"].",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": [
- ""
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Split text into chunks of a specified length.",
- "base_classes": [
- "Record"
- ],
- "display_name": "Recursive Character Text Splitter",
- "documentation": "https://docs.langflow.org/components/text-splitters#recursivecharactertextsplitter",
- "custom_fields": {
- "inputs": null,
- "separators": null,
- "chunk_size": null,
- "chunk_overlap": null
- },
- "output_types": [
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "RecursiveCharacterTextSplitter-tR9QM"
- },
- "selected": false,
- "width": 384,
- "height": 501,
- "positionAbsolute": {
- "x": 2791.013514133929,
- "y": 1462.9588953494142
- },
- "dragging": false
+ "description": "Generates text using OpenAI LLMs.",
+ "icon": "OpenAI",
+ "base_classes": ["object", "Text", "str"],
+ "display_name": "OpenAI",
+ "documentation": "",
+ "custom_fields": {
+ "input_value": null,
+ "openai_api_key": null,
+ "temperature": null,
+ "model_name": null,
+ "max_tokens": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "stream": null,
+ "system_message": null
},
- {
- "id": "AstraDBSearch-41nRz",
- "type": "genericNode",
- "position": {
- "x": 1723.976434815103,
- "y": 277.03317407245913
- },
- "data": {
- "type": "AstraDBSearch",
- "node": {
- "template": {
- "embedding": {
- "type": "Embeddings",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "embedding",
- "display_name": "Embedding",
- "advanced": false,
- "dynamic": false,
- "info": "Embedding to use",
- "load_from_db": false,
- "title_case": false
- },
- "input_value": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "input_value",
- "display_name": "Input Value",
- "advanced": false,
- "dynamic": false,
- "info": "Input value to search",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "api_endpoint": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "api_endpoint",
- "display_name": "API Endpoint",
- "advanced": false,
- "dynamic": false,
- "info": "API endpoint URL for the Astra DB service.",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "ASTRA_DB_API_ENDPOINT"
- },
- "batch_size": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "batch_size",
- "display_name": "Batch Size",
- "advanced": true,
- "dynamic": false,
- "info": "Optional number of records to process in a single batch.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_delete_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_delete_concurrency",
- "display_name": "Bulk Delete Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk delete operations.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_insert_batch_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_insert_batch_concurrency",
- "display_name": "Bulk Insert Batch Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk insert operations.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_insert_overwrite_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_insert_overwrite_concurrency",
- "display_name": "Bulk Insert Overwrite Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk insert operations that overwrite existing records.",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import List, Optional\n\nfrom langflow.components.vectorstores.AstraDB import AstraDBVectorStoreComponent\nfrom langflow.components.vectorstores.base.model import LCVectorStoreComponent\nfrom langflow.field_typing import Embeddings, Text\nfrom langflow.schema import Record\n\n\nclass AstraDBSearchComponent(LCVectorStoreComponent):\n display_name = \"Astra DB Search\"\n description = \"Searches an existing Astra DB Vector Store.\"\n icon = \"AstraDB\"\n field_order = [\"token\", \"api_endpoint\", \"collection_name\", \"input_value\", \"embedding\"]\n\n def build_config(self):\n return {\n \"search_type\": {\n \"display_name\": \"Search Type\",\n \"options\": [\"Similarity\", \"MMR\"],\n },\n \"input_value\": {\n \"display_name\": \"Input Value\",\n \"info\": \"Input value to search\",\n },\n \"embedding\": {\"display_name\": \"Embedding\", \"info\": \"Embedding to use\"},\n \"collection_name\": {\n \"display_name\": \"Collection Name\",\n \"info\": \"The name of the collection within Astra DB where the vectors will be stored.\",\n },\n \"token\": {\n \"display_name\": \"Token\",\n \"info\": \"Authentication token for accessing Astra DB.\",\n \"password\": True,\n },\n \"api_endpoint\": {\n \"display_name\": \"API Endpoint\",\n \"info\": \"API endpoint URL for the Astra DB service.\",\n },\n \"namespace\": {\n \"display_name\": \"Namespace\",\n \"info\": \"Optional namespace within Astra DB to use for the collection.\",\n \"advanced\": True,\n },\n \"metric\": {\n \"display_name\": \"Metric\",\n \"info\": \"Optional distance metric for vector comparisons in the vector store.\",\n \"advanced\": True,\n },\n \"batch_size\": {\n \"display_name\": \"Batch Size\",\n \"info\": \"Optional number of records to process in a single batch.\",\n \"advanced\": True,\n },\n \"bulk_insert_batch_concurrency\": {\n \"display_name\": \"Bulk Insert Batch Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations.\",\n \"advanced\": 
True,\n },\n \"bulk_insert_overwrite_concurrency\": {\n \"display_name\": \"Bulk Insert Overwrite Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations that overwrite existing records.\",\n \"advanced\": True,\n },\n \"bulk_delete_concurrency\": {\n \"display_name\": \"Bulk Delete Concurrency\",\n \"info\": \"Optional concurrency level for bulk delete operations.\",\n \"advanced\": True,\n },\n \"setup_mode\": {\n \"display_name\": \"Setup Mode\",\n \"info\": \"Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.\",\n \"options\": [\"Sync\", \"Async\", \"Off\"],\n \"advanced\": True,\n },\n \"pre_delete_collection\": {\n \"display_name\": \"Pre Delete Collection\",\n \"info\": \"Boolean flag to determine whether to delete the collection before creating a new one.\",\n \"advanced\": True,\n },\n \"metadata_indexing_include\": {\n \"display_name\": \"Metadata Indexing Include\",\n \"info\": \"Optional list of metadata fields to include in the indexing.\",\n \"advanced\": True,\n },\n \"metadata_indexing_exclude\": {\n \"display_name\": \"Metadata Indexing Exclude\",\n \"info\": \"Optional list of metadata fields to exclude from the indexing.\",\n \"advanced\": True,\n },\n \"collection_indexing_policy\": {\n \"display_name\": \"Collection Indexing Policy\",\n \"info\": \"Optional dictionary defining the indexing policy for the collection.\",\n \"advanced\": True,\n },\n \"number_of_results\": {\n \"display_name\": \"Number of Results\",\n \"info\": \"Number of results to return.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n embedding: Embeddings,\n collection_name: str,\n input_value: Text,\n token: str,\n api_endpoint: str,\n search_type: str = \"Similarity\",\n number_of_results: int = 4,\n namespace: Optional[str] = None,\n metric: Optional[str] = None,\n batch_size: Optional[int] = None,\n bulk_insert_batch_concurrency: Optional[int] = None,\n 
bulk_insert_overwrite_concurrency: Optional[int] = None,\n bulk_delete_concurrency: Optional[int] = None,\n setup_mode: str = \"Sync\",\n pre_delete_collection: bool = False,\n metadata_indexing_include: Optional[List[str]] = None,\n metadata_indexing_exclude: Optional[List[str]] = None,\n collection_indexing_policy: Optional[dict] = None,\n ) -> List[Record]:\n vector_store = AstraDBVectorStoreComponent().build(\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n try:\n return self.search_with_vector_store(input_value, search_type, vector_store, k=number_of_results)\n except KeyError as e:\n if \"content\" in str(e):\n raise ValueError(\n \"You should ingest data through Langflow (or LangChain) to query it in Langflow. Your collection does not contain a field name 'content'.\"\n )\n else:\n raise e\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "collection_indexing_policy": {
- "type": "dict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "collection_indexing_policy",
- "display_name": "Collection Indexing Policy",
- "advanced": true,
- "dynamic": false,
- "info": "Optional dictionary defining the indexing policy for the collection.",
- "load_from_db": false,
- "title_case": false
- },
- "collection_name": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "collection_name",
- "display_name": "Collection Name",
- "advanced": false,
- "dynamic": false,
- "info": "The name of the collection within Astra DB where the vectors will be stored.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "langflow"
- },
- "metadata_indexing_exclude": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metadata_indexing_exclude",
- "display_name": "Metadata Indexing Exclude",
- "advanced": true,
- "dynamic": false,
- "info": "Optional list of metadata fields to exclude from the indexing.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "metadata_indexing_include": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metadata_indexing_include",
- "display_name": "Metadata Indexing Include",
- "advanced": true,
- "dynamic": false,
- "info": "Optional list of metadata fields to include in the indexing.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "metric": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metric",
- "display_name": "Metric",
- "advanced": true,
- "dynamic": false,
- "info": "Optional distance metric for vector comparisons in the vector store.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "namespace": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "namespace",
- "display_name": "Namespace",
- "advanced": true,
- "dynamic": false,
- "info": "Optional namespace within Astra DB to use for the collection.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "number_of_results": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 4,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "number_of_results",
- "display_name": "Number of Results",
- "advanced": true,
- "dynamic": false,
- "info": "Number of results to return.",
- "load_from_db": false,
- "title_case": false
- },
- "pre_delete_collection": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "pre_delete_collection",
- "display_name": "Pre Delete Collection",
- "advanced": true,
- "dynamic": false,
- "info": "Boolean flag to determine whether to delete the collection before creating a new one.",
- "load_from_db": false,
- "title_case": false
- },
- "search_type": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Similarity",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Similarity",
- "MMR"
- ],
- "name": "search_type",
- "display_name": "Search Type",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "setup_mode": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Sync",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Sync",
- "Async",
- "Off"
- ],
- "name": "setup_mode",
- "display_name": "Setup Mode",
- "advanced": true,
- "dynamic": false,
- "info": "Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "token": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "token",
- "display_name": "Token",
- "advanced": false,
- "dynamic": false,
- "info": "Authentication token for accessing Astra DB.",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "ASTRA_DB_APPLICATION_TOKEN"
- },
- "_type": "CustomComponent"
- },
- "description": "Searches an existing Astra DB Vector Store.",
- "icon": "AstraDB",
- "base_classes": [
- "Record"
- ],
- "display_name": "Astra DB Search",
- "documentation": "",
- "custom_fields": {
- "embedding": null,
- "collection_name": null,
- "input_value": null,
- "token": null,
- "api_endpoint": null,
- "search_type": null,
- "number_of_results": null,
- "namespace": null,
- "metric": null,
- "batch_size": null,
- "bulk_insert_batch_concurrency": null,
- "bulk_insert_overwrite_concurrency": null,
- "bulk_delete_concurrency": null,
- "setup_mode": null,
- "pre_delete_collection": null,
- "metadata_indexing_include": null,
- "metadata_indexing_exclude": null,
- "collection_indexing_policy": null
- },
- "output_types": [
- "Record"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "token",
- "api_endpoint",
- "collection_name",
- "input_value",
- "embedding"
- ],
- "beta": false
- },
- "id": "AstraDBSearch-41nRz"
- },
- "selected": false,
- "width": 384,
- "height": 713,
- "dragging": false,
- "positionAbsolute": {
- "x": 1723.976434815103,
- "y": 277.03317407245913
- }
+ "output_types": ["Text"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "max_tokens",
+ "model_kwargs",
+ "model_name",
+ "openai_api_base",
+ "openai_api_key",
+ "temperature",
+ "input_value",
+ "system_message",
+ "stream"
+ ],
+ "beta": false
+ },
+ "id": "OpenAIModel-EjXlN"
+ },
+ "selected": true,
+ "width": 384,
+ "height": 563,
+ "positionAbsolute": {
+ "x": 3410.117202077183,
+ "y": 431.2038048137648
+ },
+ "dragging": false
+ },
+ {
+ "id": "Prompt-xeI6K",
+ "type": "genericNode",
+ "position": {
+ "x": 2969.0261961391298,
+ "y": 442.1613649809069
+ },
+ "data": {
+ "type": "Prompt",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from langchain_core.prompts import PromptTemplate\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Prompt, TemplateField, Text\n\n\nclass PromptComponent(CustomComponent):\n display_name: str = \"Prompt\"\n description: str = \"Create a prompt template with dynamic variables.\"\n icon = \"prompts\"\n\n def build_config(self):\n return {\n \"template\": TemplateField(display_name=\"Template\"),\n \"code\": TemplateField(advanced=True),\n }\n\n def build(\n self,\n template: Prompt,\n **kwargs,\n ) -> Text:\n from langflow.base.prompts.utils import dict_values_to_string\n\n prompt_template = PromptTemplate.from_template(Text(template))\n kwargs = dict_values_to_string(kwargs)\n kwargs = {k: \"\\n\".join(v) if isinstance(v, list) else v for k, v in kwargs.items()}\n try:\n formated_prompt = prompt_template.format(**kwargs)\n except Exception as exc:\n raise ValueError(f\"Error formatting prompt: {exc}\") from exc\n self.status = f'Prompt:\\n\"{formated_prompt}\"'\n return formated_prompt\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "template": {
+ "type": "prompt",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "{context}\n\n---\n\nGiven the context above, answer the question as best as possible.\n\nQuestion: {question}\n\nAnswer: ",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "template",
+ "display_name": "Template",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent",
+ "context": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "context",
+ "display_name": "context",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ },
+ "question": {
+ "field_type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "question",
+ "display_name": "question",
+ "advanced": false,
+ "input_types": [
+ "Document",
+ "BaseOutputParser",
+ "Record",
+ "Text"
+ ],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "type": "str"
+ }
},
- {
- "id": "AstraDB-eUCSS",
- "type": "genericNode",
- "position": {
- "x": 3372.04958055989,
- "y": 1611.0742035495277
- },
- "data": {
- "type": "AstraDB",
- "node": {
- "template": {
- "embedding": {
- "type": "Embeddings",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "embedding",
- "display_name": "Embedding",
- "advanced": false,
- "dynamic": false,
- "info": "Embedding to use",
- "load_from_db": false,
- "title_case": false
- },
- "inputs": {
- "type": "Record",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "inputs",
- "display_name": "Inputs",
- "advanced": false,
- "dynamic": false,
- "info": "Optional list of records to be processed and stored in the vector store.",
- "load_from_db": false,
- "title_case": false
- },
- "api_endpoint": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "api_endpoint",
- "display_name": "API Endpoint",
- "advanced": false,
- "dynamic": false,
- "info": "API endpoint URL for the Astra DB service.",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "ASTRA_DB_API_ENDPOINT"
- },
- "batch_size": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "batch_size",
- "display_name": "Batch Size",
- "advanced": true,
- "dynamic": false,
- "info": "Optional number of records to process in a single batch.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_delete_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_delete_concurrency",
- "display_name": "Bulk Delete Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk delete operations.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_insert_batch_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_insert_batch_concurrency",
- "display_name": "Bulk Insert Batch Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk insert operations.",
- "load_from_db": false,
- "title_case": false
- },
- "bulk_insert_overwrite_concurrency": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "bulk_insert_overwrite_concurrency",
- "display_name": "Bulk Insert Overwrite Concurrency",
- "advanced": true,
- "dynamic": false,
- "info": "Optional concurrency level for bulk insert operations that overwrite existing records.",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import List, Optional, Union\nfrom langchain_astradb import AstraDBVectorStore\nfrom langchain_astradb.utils.astradb import SetupMode\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Embeddings, VectorStore\nfrom langflow.schema import Record\nfrom langchain_core.retrievers import BaseRetriever\n\n\nclass AstraDBVectorStoreComponent(CustomComponent):\n display_name = \"Astra DB\"\n description = \"Builds or loads an Astra DB Vector Store.\"\n icon = \"AstraDB\"\n field_order = [\"token\", \"api_endpoint\", \"collection_name\", \"inputs\", \"embedding\"]\n\n def build_config(self):\n return {\n \"inputs\": {\n \"display_name\": \"Inputs\",\n \"info\": \"Optional list of records to be processed and stored in the vector store.\",\n },\n \"embedding\": {\"display_name\": \"Embedding\", \"info\": \"Embedding to use\"},\n \"collection_name\": {\n \"display_name\": \"Collection Name\",\n \"info\": \"The name of the collection within Astra DB where the vectors will be stored.\",\n },\n \"token\": {\n \"display_name\": \"Token\",\n \"info\": \"Authentication token for accessing Astra DB.\",\n \"password\": True,\n },\n \"api_endpoint\": {\n \"display_name\": \"API Endpoint\",\n \"info\": \"API endpoint URL for the Astra DB service.\",\n },\n \"namespace\": {\n \"display_name\": \"Namespace\",\n \"info\": \"Optional namespace within Astra DB to use for the collection.\",\n \"advanced\": True,\n },\n \"metric\": {\n \"display_name\": \"Metric\",\n \"info\": \"Optional distance metric for vector comparisons in the vector store.\",\n \"advanced\": True,\n },\n \"batch_size\": {\n \"display_name\": \"Batch Size\",\n \"info\": \"Optional number of records to process in a single batch.\",\n \"advanced\": True,\n },\n \"bulk_insert_batch_concurrency\": {\n \"display_name\": \"Bulk Insert Batch Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations.\",\n \"advanced\": True,\n },\n 
\"bulk_insert_overwrite_concurrency\": {\n \"display_name\": \"Bulk Insert Overwrite Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations that overwrite existing records.\",\n \"advanced\": True,\n },\n \"bulk_delete_concurrency\": {\n \"display_name\": \"Bulk Delete Concurrency\",\n \"info\": \"Optional concurrency level for bulk delete operations.\",\n \"advanced\": True,\n },\n \"setup_mode\": {\n \"display_name\": \"Setup Mode\",\n \"info\": \"Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.\",\n \"options\": [\"Sync\", \"Async\", \"Off\"],\n \"advanced\": True,\n },\n \"pre_delete_collection\": {\n \"display_name\": \"Pre Delete Collection\",\n \"info\": \"Boolean flag to determine whether to delete the collection before creating a new one.\",\n \"advanced\": True,\n },\n \"metadata_indexing_include\": {\n \"display_name\": \"Metadata Indexing Include\",\n \"info\": \"Optional list of metadata fields to include in the indexing.\",\n \"advanced\": True,\n },\n \"metadata_indexing_exclude\": {\n \"display_name\": \"Metadata Indexing Exclude\",\n \"info\": \"Optional list of metadata fields to exclude from the indexing.\",\n \"advanced\": True,\n },\n \"collection_indexing_policy\": {\n \"display_name\": \"Collection Indexing Policy\",\n \"info\": \"Optional dictionary defining the indexing policy for the collection.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n embedding: Embeddings,\n token: str,\n api_endpoint: str,\n collection_name: str,\n inputs: Optional[List[Record]] = None,\n namespace: Optional[str] = None,\n metric: Optional[str] = None,\n batch_size: Optional[int] = None,\n bulk_insert_batch_concurrency: Optional[int] = None,\n bulk_insert_overwrite_concurrency: Optional[int] = None,\n bulk_delete_concurrency: Optional[int] = None,\n setup_mode: str = \"Sync\",\n pre_delete_collection: bool = False,\n metadata_indexing_include: 
Optional[List[str]] = None,\n metadata_indexing_exclude: Optional[List[str]] = None,\n collection_indexing_policy: Optional[dict] = None,\n ) -> Union[VectorStore, BaseRetriever]:\n try:\n setup_mode_value = SetupMode[setup_mode.upper()]\n except KeyError:\n raise ValueError(f\"Invalid setup mode: {setup_mode}\")\n if inputs:\n documents = [_input.to_lc_document() for _input in inputs]\n\n vector_store = AstraDBVectorStore.from_documents(\n documents=documents,\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode_value,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n else:\n vector_store = AstraDBVectorStore(\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode_value,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n\n return vector_store\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "collection_indexing_policy": {
- "type": "dict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "collection_indexing_policy",
- "display_name": "Collection Indexing Policy",
- "advanced": true,
- "dynamic": false,
- "info": "Optional dictionary defining the indexing policy for the collection.",
- "load_from_db": false,
- "title_case": false
- },
- "collection_name": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "collection_name",
- "display_name": "Collection Name",
- "advanced": false,
- "dynamic": false,
- "info": "The name of the collection within Astra DB where the vectors will be stored.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "langflow"
- },
- "metadata_indexing_exclude": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metadata_indexing_exclude",
- "display_name": "Metadata Indexing Exclude",
- "advanced": true,
- "dynamic": false,
- "info": "Optional list of metadata fields to exclude from the indexing.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "metadata_indexing_include": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metadata_indexing_include",
- "display_name": "Metadata Indexing Include",
- "advanced": true,
- "dynamic": false,
- "info": "Optional list of metadata fields to include in the indexing.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "metric": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "metric",
- "display_name": "Metric",
- "advanced": true,
- "dynamic": false,
- "info": "Optional distance metric for vector comparisons in the vector store.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "namespace": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "namespace",
- "display_name": "Namespace",
- "advanced": true,
- "dynamic": false,
- "info": "Optional namespace within Astra DB to use for the collection.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "pre_delete_collection": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "pre_delete_collection",
- "display_name": "Pre Delete Collection",
- "advanced": true,
- "dynamic": false,
- "info": "Boolean flag to determine whether to delete the collection before creating a new one.",
- "load_from_db": false,
- "title_case": false
- },
- "setup_mode": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "Sync",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "Sync",
- "Async",
- "Off"
- ],
- "name": "setup_mode",
- "display_name": "Setup Mode",
- "advanced": true,
- "dynamic": false,
- "info": "Configuration mode for setting up the vector store, with options like \u201cSync\u201d, \u201cAsync\u201d, or \u201cOff\u201d.",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "token": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "token",
- "display_name": "Token",
- "advanced": false,
- "dynamic": false,
- "info": "Authentication token for accessing Astra DB.",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "ASTRA_DB_APPLICATION_TOKEN"
- },
- "_type": "CustomComponent"
- },
- "description": "Builds or loads an Astra DB Vector Store.",
- "icon": "AstraDB",
- "base_classes": [
- "VectorStore"
- ],
- "display_name": "Astra DB",
- "documentation": "",
- "custom_fields": {
- "embedding": null,
- "token": null,
- "api_endpoint": null,
- "collection_name": null,
- "inputs": null,
- "namespace": null,
- "metric": null,
- "batch_size": null,
- "bulk_insert_batch_concurrency": null,
- "bulk_insert_overwrite_concurrency": null,
- "bulk_delete_concurrency": null,
- "setup_mode": null,
- "pre_delete_collection": null,
- "metadata_indexing_include": null,
- "metadata_indexing_exclude": null,
- "collection_indexing_policy": null
- },
- "output_types": [
- "VectorStore"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [
- "token",
- "api_endpoint",
- "collection_name",
- "inputs",
- "embedding"
- ],
- "beta": false
- },
- "id": "AstraDB-eUCSS"
- },
- "selected": false,
- "width": 384,
- "height": 573,
- "positionAbsolute": {
- "x": 3372.04958055989,
- "y": 1611.0742035495277
- },
- "dragging": false
+ "description": "Create a prompt template with dynamic variables.",
+ "icon": "prompts",
+ "is_input": null,
+ "is_output": null,
+ "is_composition": null,
+ "base_classes": ["object", "Text", "str"],
+ "name": "",
+ "display_name": "Prompt",
+ "documentation": "",
+ "custom_fields": {
+ "template": ["context", "question"]
},
- {
- "id": "OpenAIEmbeddings-9TPjc",
- "type": "genericNode",
- "position": {
- "x": 2814.0402191223047,
- "y": 1955.9268168273086
- },
- "data": {
- "type": "OpenAIEmbeddings",
- "node": {
- "template": {
- "allowed_special": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": [],
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "allowed_special",
- "display_name": "Allowed Special",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "chunk_size": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 1000,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "chunk_size",
- "display_name": "Chunk Size",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "client": {
- "type": "Any",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "client",
- "display_name": "Client",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "code": {
- "type": "code",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": true,
- "value": "from typing import Dict, List, Optional\n\nfrom langchain_openai.embeddings.base import OpenAIEmbeddings\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Embeddings, NestedDict\n\n\nclass OpenAIEmbeddingsComponent(CustomComponent):\n display_name = \"OpenAI Embeddings\"\n description = \"Generate embeddings using OpenAI models.\"\n\n def build_config(self):\n return {\n \"allowed_special\": {\n \"display_name\": \"Allowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"default_headers\": {\n \"display_name\": \"Default Headers\",\n \"advanced\": True,\n \"field_type\": \"dict\",\n },\n \"default_query\": {\n \"display_name\": \"Default Query\",\n \"advanced\": True,\n \"field_type\": \"NestedDict\",\n },\n \"disallowed_special\": {\n \"display_name\": \"Disallowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"chunk_size\": {\"display_name\": \"Chunk Size\", \"advanced\": True},\n \"client\": {\"display_name\": \"Client\", \"advanced\": True},\n \"deployment\": {\"display_name\": \"Deployment\", \"advanced\": True},\n \"embedding_ctx_length\": {\n \"display_name\": \"Embedding Context Length\",\n \"advanced\": True,\n },\n \"max_retries\": {\"display_name\": \"Max Retries\", \"advanced\": True},\n \"model\": {\n \"display_name\": \"Model\",\n \"advanced\": False,\n \"options\": [\n \"text-embedding-3-small\",\n \"text-embedding-3-large\",\n \"text-embedding-ada-002\",\n ],\n },\n \"model_kwargs\": {\"display_name\": \"Model Kwargs\", \"advanced\": True},\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"password\": True,\n \"advanced\": True,\n },\n \"openai_api_key\": {\"display_name\": \"OpenAI API Key\", \"password\": True},\n \"openai_api_type\": {\n \"display_name\": \"OpenAI API Type\",\n \"advanced\": True,\n \"password\": True,\n },\n \"openai_api_version\": {\n 
\"display_name\": \"OpenAI API Version\",\n \"advanced\": True,\n },\n \"openai_organization\": {\n \"display_name\": \"OpenAI Organization\",\n \"advanced\": True,\n },\n \"openai_proxy\": {\"display_name\": \"OpenAI Proxy\", \"advanced\": True},\n \"request_timeout\": {\"display_name\": \"Request Timeout\", \"advanced\": True},\n \"show_progress_bar\": {\n \"display_name\": \"Show Progress Bar\",\n \"advanced\": True,\n },\n \"skip_empty\": {\"display_name\": \"Skip Empty\", \"advanced\": True},\n \"tiktoken_model_name\": {\n \"display_name\": \"TikToken Model Name\",\n \"advanced\": True,\n },\n \"tiktoken_enable\": {\"display_name\": \"TikToken Enable\", \"advanced\": True},\n }\n\n def build(\n self,\n openai_api_key: str,\n default_headers: Optional[Dict[str, str]] = None,\n default_query: Optional[NestedDict] = {},\n allowed_special: List[str] = [],\n disallowed_special: List[str] = [\"all\"],\n chunk_size: int = 1000,\n deployment: str = \"text-embedding-ada-002\",\n embedding_ctx_length: int = 8191,\n max_retries: int = 6,\n model: str = \"text-embedding-ada-002\",\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n openai_api_type: Optional[str] = None,\n openai_api_version: Optional[str] = None,\n openai_organization: Optional[str] = None,\n openai_proxy: Optional[str] = None,\n request_timeout: Optional[float] = None,\n show_progress_bar: bool = False,\n skip_empty: bool = False,\n tiktoken_enable: bool = True,\n tiktoken_model_name: Optional[str] = None,\n ) -> Embeddings:\n # This is to avoid errors with Vector Stores (e.g Chroma)\n if disallowed_special == [\"all\"]:\n disallowed_special = \"all\" # type: ignore\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n return OpenAIEmbeddings(\n tiktoken_enabled=tiktoken_enable,\n default_headers=default_headers,\n default_query=default_query,\n allowed_special=set(allowed_special),\n disallowed_special=\"all\",\n chunk_size=chunk_size,\n 
deployment=deployment,\n embedding_ctx_length=embedding_ctx_length,\n max_retries=max_retries,\n model=model,\n model_kwargs=model_kwargs,\n base_url=openai_api_base,\n api_key=api_key,\n openai_api_type=openai_api_type,\n api_version=openai_api_version,\n organization=openai_organization,\n openai_proxy=openai_proxy,\n timeout=request_timeout,\n show_progress_bar=show_progress_bar,\n skip_empty=skip_empty,\n tiktoken_model_name=tiktoken_model_name,\n )\n",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "code",
- "advanced": true,
- "dynamic": true,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "default_headers": {
- "type": "dict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "default_headers",
- "display_name": "Default Headers",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "default_query": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "default_query",
- "display_name": "Default Query",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "deployment": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": "text-embedding-ada-002",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "deployment",
- "display_name": "Deployment",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "disallowed_special": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": [
- "all"
- ],
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "disallowed_special",
- "display_name": "Disallowed Special",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "embedding_ctx_length": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 8191,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "embedding_ctx_length",
- "display_name": "Embedding Context Length",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "max_retries": {
- "type": "int",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": 6,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "max_retries",
- "display_name": "Max Retries",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "model": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": true,
- "show": true,
- "multiline": false,
- "value": "text-embedding-ada-002",
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "options": [
- "text-embedding-3-small",
- "text-embedding-3-large",
- "text-embedding-ada-002"
- ],
- "name": "model",
- "display_name": "Model",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "model_kwargs": {
- "type": "NestedDict",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": {},
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "model_kwargs",
- "display_name": "Model Kwargs",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "openai_api_base": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_base",
- "display_name": "OpenAI API Base",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_key": {
- "type": "str",
- "required": true,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_key",
- "display_name": "OpenAI API Key",
- "advanced": false,
- "dynamic": false,
- "info": "",
- "load_from_db": true,
- "title_case": false,
- "input_types": [
- "Text"
- ],
- "value": "OPENAI_API_KEY"
- },
- "openai_api_type": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": true,
- "name": "openai_api_type",
- "display_name": "OpenAI API Type",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_api_version": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_api_version",
- "display_name": "OpenAI API Version",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_organization": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_organization",
- "display_name": "OpenAI Organization",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "openai_proxy": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "openai_proxy",
- "display_name": "OpenAI Proxy",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "request_timeout": {
- "type": "float",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "request_timeout",
- "display_name": "Request Timeout",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "rangeSpec": {
- "step_type": "float",
- "min": -1,
- "max": 1,
- "step": 0.1
- },
- "load_from_db": false,
- "title_case": false
- },
- "show_progress_bar": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "show_progress_bar",
- "display_name": "Show Progress Bar",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "skip_empty": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "skip_empty",
- "display_name": "Skip Empty",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "tiktoken_enable": {
- "type": "bool",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "value": true,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "tiktoken_enable",
- "display_name": "TikToken Enable",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false
- },
- "tiktoken_model_name": {
- "type": "str",
- "required": false,
- "placeholder": "",
- "list": false,
- "show": true,
- "multiline": false,
- "fileTypes": [],
- "file_path": "",
- "password": false,
- "name": "tiktoken_model_name",
- "display_name": "TikToken Model Name",
- "advanced": true,
- "dynamic": false,
- "info": "",
- "load_from_db": false,
- "title_case": false,
- "input_types": [
- "Text"
- ]
- },
- "_type": "CustomComponent"
- },
- "description": "Generate embeddings using OpenAI models.",
- "base_classes": [
- "Embeddings"
- ],
- "display_name": "OpenAI Embeddings",
- "documentation": "",
- "custom_fields": {
- "openai_api_key": null,
- "default_headers": null,
- "default_query": null,
- "allowed_special": null,
- "disallowed_special": null,
- "chunk_size": null,
- "client": null,
- "deployment": null,
- "embedding_ctx_length": null,
- "max_retries": null,
- "model": null,
- "model_kwargs": null,
- "openai_api_base": null,
- "openai_api_type": null,
- "openai_api_version": null,
- "openai_organization": null,
- "openai_proxy": null,
- "request_timeout": null,
- "show_progress_bar": null,
- "skip_empty": null,
- "tiktoken_enable": null,
- "tiktoken_model_name": null
- },
- "output_types": [
- "Embeddings"
- ],
- "field_formatters": {},
- "frozen": false,
- "field_order": [],
- "beta": false
- },
- "id": "OpenAIEmbeddings-9TPjc"
- },
- "selected": false,
- "width": 384,
- "height": 383,
- "positionAbsolute": {
- "x": 2814.0402191223047,
- "y": 1955.9268168273086
- },
- "dragging": false
- }
- ],
- "edges": [
- {
- "source": "TextOutput-BDknO",
- "target": "Prompt-xeI6K",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153TextOutput\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153}",
- "targetHandle": "{\u0153fieldName\u0153:\u0153context\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "id": "reactflow__edge-TextOutput-BDknO{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153TextOutput\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153}-Prompt-xeI6K{\u0153fieldName\u0153:\u0153context\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "context",
- "id": "Prompt-xeI6K",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "Text",
- "str"
- ],
- "dataType": "TextOutput",
- "id": "TextOutput-BDknO"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "selected": false
+ "output_types": ["Text"],
+ "full_path": null,
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false,
+ "error": null
+ },
+ "id": "Prompt-xeI6K",
+ "description": "Create a prompt template with dynamic variables.",
+ "display_name": "Prompt"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 477,
+ "positionAbsolute": {
+ "x": 2969.0261961391298,
+ "y": 442.1613649809069
+ },
+ "dragging": false
+ },
+ {
+ "id": "ChatOutput-Q39I8",
+ "type": "genericNode",
+ "position": {
+ "x": 3887.2073667611485,
+ "y": 588.4801225794856
+ },
+ "data": {
+ "type": "ChatOutput",
+ "node": {
+ "template": {
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional, Union\n\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.field_typing import Text\nfrom langflow.schema import Record\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n\n def build(\n self,\n sender: Optional[str] = \"Machine\",\n sender_name: Optional[str] = \"AI\",\n input_value: Optional[str] = None,\n session_id: Optional[str] = None,\n return_record: Optional[bool] = False,\n record_template: Optional[str] = \"{text}\",\n ) -> Union[Text, Record]:\n return super().build_with_record(\n sender=sender,\n sender_name=sender_name,\n input_value=input_value,\n session_id=session_id,\n return_record=return_record,\n record_template=record_template or \"\",\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Message",
+ "advanced": false,
+ "input_types": ["Text"],
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "record_template": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "{text}",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "record_template",
+ "display_name": "Record Template",
+ "advanced": true,
+ "dynamic": false,
+ "info": "In case of Message being a Record, this template will be used to convert it to text.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "return_record": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "return_record",
+ "display_name": "Return Record",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Return the message as a record containing the sender, sender_name, and session_id.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "sender": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Machine",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Machine", "User"],
+ "name": "sender",
+ "display_name": "Sender Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "sender_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "AI",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "sender_name",
+ "display_name": "Sender Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "session_id": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "session_id",
+ "display_name": "Session ID",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If provided, the message will be stored in the memory.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "ChatInput-yxMKE",
- "target": "Prompt-xeI6K",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}",
- "targetHandle": "{\u0153fieldName\u0153:\u0153question\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "id": "reactflow__edge-ChatInput-yxMKE{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}-Prompt-xeI6K{\u0153fieldName\u0153:\u0153question\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153BaseOutputParser\u0153,\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "question",
- "id": "Prompt-xeI6K",
- "inputTypes": [
- "Document",
- "BaseOutputParser",
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "Text",
- "str",
- "object",
- "Record"
- ],
- "dataType": "ChatInput",
- "id": "ChatInput-yxMKE"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "selected": false
+ "description": "Display a chat message in the Playground.",
+ "icon": "ChatOutput",
+ "base_classes": ["object", "Text", "Record", "str"],
+ "display_name": "Chat Output",
+ "documentation": "",
+ "custom_fields": {
+ "sender": null,
+ "sender_name": null,
+ "input_value": null,
+ "session_id": null,
+ "return_record": null,
+ "record_template": null
},
- {
- "source": "Prompt-xeI6K",
- "target": "OpenAIModel-EjXlN",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153}",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "id": "reactflow__edge-Prompt-xeI6K{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153Prompt\u0153,\u0153id\u0153:\u0153Prompt-xeI6K\u0153}-OpenAIModel-EjXlN{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "OpenAIModel-EjXlN",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "Text",
- "str"
- ],
- "dataType": "Prompt",
- "id": "Prompt-xeI6K"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "selected": false
+ "output_types": ["Text", "Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "ChatOutput-Q39I8"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 383,
+ "positionAbsolute": {
+ "x": 3887.2073667611485,
+ "y": 588.4801225794856
+ },
+ "dragging": false
+ },
+ {
+ "id": "File-t0a6a",
+ "type": "genericNode",
+ "position": {
+ "x": 2257.233450682836,
+ "y": 1747.5389618367233
+ },
+ "data": {
+ "type": "File",
+ "node": {
+ "template": {
+ "path": {
+ "type": "file",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [
+ ".txt",
+ ".md",
+ ".mdx",
+ ".csv",
+ ".json",
+ ".yaml",
+ ".yml",
+ ".xml",
+ ".html",
+ ".htm",
+ ".pdf",
+ ".docx",
+ ".py",
+ ".sh",
+ ".sql",
+ ".js",
+ ".ts",
+ ".tsx"
+ ],
+ "file_path": "51e2b78a-199b-4054-9f32-e288eef6924c/Langflow conversation.pdf",
+ "password": false,
+ "name": "path",
+ "display_name": "Path",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Supported file types: txt, md, mdx, csv, json, yaml, yml, xml, html, htm, pdf, docx, py, sh, sql, js, ts, tsx",
+ "load_from_db": false,
+ "title_case": false,
+ "value": ""
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from pathlib import Path\nfrom typing import Any, Dict\n\nfrom langflow.base.data.utils import TEXT_FILE_TYPES, parse_text_file_to_record\nfrom langflow.custom import CustomComponent\nfrom langflow.schema import Record\n\n\nclass FileComponent(CustomComponent):\n display_name = \"File\"\n description = \"A generic file loader.\"\n icon = \"file-text\"\n\n def build_config(self) -> Dict[str, Any]:\n return {\n \"path\": {\n \"display_name\": \"Path\",\n \"field_type\": \"file\",\n \"file_types\": TEXT_FILE_TYPES,\n \"info\": f\"Supported file types: {', '.join(TEXT_FILE_TYPES)}\",\n },\n \"silent_errors\": {\n \"display_name\": \"Silent Errors\",\n \"advanced\": True,\n \"info\": \"If true, errors will not raise an exception.\",\n },\n }\n\n def load_file(self, path: str, silent_errors: bool = False) -> Record:\n resolved_path = self.resolve_path(path)\n path_obj = Path(resolved_path)\n extension = path_obj.suffix[1:].lower()\n if extension == \"doc\":\n raise ValueError(\"doc files are not supported. Please save as .docx\")\n if extension not in TEXT_FILE_TYPES:\n raise ValueError(f\"Unsupported file type: {extension}\")\n record = parse_text_file_to_record(resolved_path, silent_errors)\n self.status = record if record else \"No data\"\n return record or Record()\n\n def build(\n self,\n path: str,\n silent_errors: bool = False,\n ) -> Record:\n record = self.load_file(path, silent_errors)\n self.status = record\n return record\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "silent_errors": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "silent_errors",
+ "display_name": "Silent Errors",
+ "advanced": true,
+ "dynamic": false,
+ "info": "If true, errors will not raise an exception.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "OpenAIModel-EjXlN",
- "target": "ChatOutput-Q39I8",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153}",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-Q39I8\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "id": "reactflow__edge-OpenAIModel-EjXlN{\u0153baseClasses\u0153:[\u0153object\u0153,\u0153Text\u0153,\u0153str\u0153],\u0153dataType\u0153:\u0153OpenAIModel\u0153,\u0153id\u0153:\u0153OpenAIModel-EjXlN\u0153}-ChatOutput-Q39I8{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153ChatOutput-Q39I8\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "ChatOutput-Q39I8",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "object",
- "Text",
- "str"
- ],
- "dataType": "OpenAIModel",
- "id": "OpenAIModel-EjXlN"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "selected": false
+ "description": "A generic file loader.",
+ "icon": "file-text",
+ "base_classes": ["Record"],
+ "display_name": "File",
+ "documentation": "",
+ "custom_fields": {
+ "path": null,
+ "silent_errors": null
},
- {
- "source": "File-t0a6a",
- "target": "RecursiveCharacterTextSplitter-tR9QM",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153File\u0153,\u0153id\u0153:\u0153File-t0a6a\u0153}",
- "targetHandle": "{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153Record\u0153],\u0153type\u0153:\u0153Document\u0153}",
- "id": "reactflow__edge-File-t0a6a{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153File\u0153,\u0153id\u0153:\u0153File-t0a6a\u0153}-RecursiveCharacterTextSplitter-tR9QM{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153,\u0153inputTypes\u0153:[\u0153Document\u0153,\u0153Record\u0153],\u0153type\u0153:\u0153Document\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "inputs",
- "id": "RecursiveCharacterTextSplitter-tR9QM",
- "inputTypes": [
- "Document",
- "Record"
- ],
- "type": "Document"
- },
- "sourceHandle": {
- "baseClasses": [
- "Record"
- ],
- "dataType": "File",
- "id": "File-t0a6a"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "selected": false
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "File-t0a6a"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 281,
+ "positionAbsolute": {
+ "x": 2257.233450682836,
+ "y": 1747.5389618367233
+ },
+ "dragging": false
+ },
+ {
+ "id": "RecursiveCharacterTextSplitter-tR9QM",
+ "type": "genericNode",
+ "position": {
+ "x": 2791.013514133929,
+ "y": 1462.9588953494142
+ },
+ "data": {
+ "type": "RecursiveCharacterTextSplitter",
+ "node": {
+ "template": {
+ "inputs": {
+ "type": "Document",
+ "required": true,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "inputs",
+ "display_name": "Input",
+ "advanced": false,
+ "input_types": ["Document", "Record"],
+ "dynamic": false,
+ "info": "The texts to split.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "chunk_overlap": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 200,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "chunk_overlap",
+ "display_name": "Chunk Overlap",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The amount of overlap between chunks.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "chunk_size": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 1000,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "chunk_size",
+ "display_name": "Chunk Size",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The maximum length of each chunk.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Optional\n\nfrom langchain_core.documents import Document\nfrom langchain_text_splitters import RecursiveCharacterTextSplitter\n\nfrom langflow.custom import CustomComponent\nfrom langflow.schema import Record\nfrom langflow.utils.util import build_loader_repr_from_records, unescape_string\n\n\nclass RecursiveCharacterTextSplitterComponent(CustomComponent):\n display_name: str = \"Recursive Character Text Splitter\"\n description: str = \"Split text into chunks of a specified length.\"\n documentation: str = \"https://docs.langflow.org/components/text-splitters#recursivecharactertextsplitter\"\n\n def build_config(self):\n return {\n \"inputs\": {\n \"display_name\": \"Input\",\n \"info\": \"The texts to split.\",\n \"input_types\": [\"Document\", \"Record\"],\n },\n \"separators\": {\n \"display_name\": \"Separators\",\n \"info\": 'The characters to split on.\\nIf left empty defaults to [\"\\\\n\\\\n\", \"\\\\n\", \" \", \"\"].',\n \"is_list\": True,\n },\n \"chunk_size\": {\n \"display_name\": \"Chunk Size\",\n \"info\": \"The maximum length of each chunk.\",\n \"field_type\": \"int\",\n \"value\": 1000,\n },\n \"chunk_overlap\": {\n \"display_name\": \"Chunk Overlap\",\n \"info\": \"The amount of overlap between chunks.\",\n \"field_type\": \"int\",\n \"value\": 200,\n },\n \"code\": {\"show\": False},\n }\n\n def build(\n self,\n inputs: list[Document],\n separators: Optional[list[str]] = None,\n chunk_size: Optional[int] = 1000,\n chunk_overlap: Optional[int] = 200,\n ) -> list[Record]:\n \"\"\"\n Split text into chunks of a specified length.\n\n Args:\n separators (list[str]): The characters to split on.\n chunk_size (int): The maximum length of each chunk.\n chunk_overlap (int): The amount of overlap between chunks.\n length_function (function): The function to use to calculate the length of the text.\n\n Returns:\n list[str]: The chunks of text.\n \"\"\"\n\n if separators == \"\":\n separators = None\n elif separators:\n # check if the separators list has escaped characters\n # if there are escaped characters, unescape them\n separators = [unescape_string(x) for x in separators]\n\n # Make sure chunk_size and chunk_overlap are ints\n if isinstance(chunk_size, str):\n chunk_size = int(chunk_size)\n if isinstance(chunk_overlap, str):\n chunk_overlap = int(chunk_overlap)\n splitter = RecursiveCharacterTextSplitter(\n separators=separators,\n chunk_size=chunk_size,\n chunk_overlap=chunk_overlap,\n )\n documents = []\n for _input in inputs:\n if isinstance(_input, Record):\n documents.append(_input.to_lc_document())\n else:\n documents.append(_input)\n docs = splitter.split_documents(documents)\n records = self.to_records(docs)\n self.repr_value = build_loader_repr_from_records(records)\n return records\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "separators": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "separators",
+ "display_name": "Separators",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The characters to split on.\nIf left empty defaults to [\"\\n\\n\", \"\\n\", \" \", \"\"].",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": [""]
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "OpenAIEmbeddings-ZlOk1",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-ZlOk1\u0153}",
- "target": "AstraDBSearch-41nRz",
- "targetHandle": "{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "embedding",
- "id": "AstraDBSearch-41nRz",
- "inputTypes": null,
- "type": "Embeddings"
- },
- "sourceHandle": {
- "baseClasses": [
- "Embeddings"
- ],
- "dataType": "OpenAIEmbeddings",
- "id": "OpenAIEmbeddings-ZlOk1"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-OpenAIEmbeddings-ZlOk1{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-ZlOk1\u0153}-AstraDBSearch-41nRz{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}"
+ "description": "Split text into chunks of a specified length.",
+ "base_classes": ["Record"],
+ "display_name": "Recursive Character Text Splitter",
+ "documentation": "https://docs.langflow.org/components/text-splitters#recursivecharactertextsplitter",
+ "custom_fields": {
+ "inputs": null,
+ "separators": null,
+ "chunk_size": null,
+ "chunk_overlap": null
},
- {
- "source": "ChatInput-yxMKE",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}",
- "target": "AstraDBSearch-41nRz",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "AstraDBSearch-41nRz",
- "inputTypes": [
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "Text",
- "str",
- "object",
- "Record"
- ],
- "dataType": "ChatInput",
- "id": "ChatInput-yxMKE"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-ChatInput-yxMKE{\u0153baseClasses\u0153:[\u0153Text\u0153,\u0153str\u0153,\u0153object\u0153,\u0153Record\u0153],\u0153dataType\u0153:\u0153ChatInput\u0153,\u0153id\u0153:\u0153ChatInput-yxMKE\u0153}-AstraDBSearch-41nRz{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153,\u0153inputTypes\u0153:[\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "RecursiveCharacterTextSplitter-tR9QM"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 501,
+ "positionAbsolute": {
+ "x": 2791.013514133929,
+ "y": 1462.9588953494142
+ },
+ "dragging": false
+ },
+ {
+ "id": "AstraDBSearch-41nRz",
+ "type": "genericNode",
+ "position": {
+ "x": 1723.976434815103,
+ "y": 277.03317407245913
+ },
+ "data": {
+ "type": "AstraDBSearch",
+ "node": {
+ "template": {
+ "embedding": {
+ "type": "Embeddings",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "embedding",
+ "display_name": "Embedding",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Embedding to use",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "input_value": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "input_value",
+ "display_name": "Input Value",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Input value to search",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "api_endpoint": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "api_endpoint",
+ "display_name": "API Endpoint",
+ "advanced": false,
+ "dynamic": false,
+ "info": "API endpoint URL for the Astra DB service.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "ASTRA_DB_API_ENDPOINT"
+ },
+ "batch_size": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "batch_size",
+ "display_name": "Batch Size",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional number of records to process in a single batch.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_delete_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_delete_concurrency",
+ "display_name": "Bulk Delete Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk delete operations.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_insert_batch_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_insert_batch_concurrency",
+ "display_name": "Bulk Insert Batch Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk insert operations.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_insert_overwrite_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_insert_overwrite_concurrency",
+ "display_name": "Bulk Insert Overwrite Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk insert operations that overwrite existing records.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import List, Optional\n\nfrom langflow.components.vectorstores.AstraDB import AstraDBVectorStoreComponent\nfrom langflow.components.vectorstores.base.model import LCVectorStoreComponent\nfrom langflow.field_typing import Embeddings, Text\nfrom langflow.schema import Record\n\n\nclass AstraDBSearchComponent(LCVectorStoreComponent):\n display_name = \"Astra DB Search\"\n description = \"Searches an existing Astra DB Vector Store.\"\n icon = \"AstraDB\"\n field_order = [\"token\", \"api_endpoint\", \"collection_name\", \"input_value\", \"embedding\"]\n\n def build_config(self):\n return {\n \"search_type\": {\n \"display_name\": \"Search Type\",\n \"options\": [\"Similarity\", \"MMR\"],\n },\n \"input_value\": {\n \"display_name\": \"Input Value\",\n \"info\": \"Input value to search\",\n },\n \"embedding\": {\"display_name\": \"Embedding\", \"info\": \"Embedding to use\"},\n \"collection_name\": {\n \"display_name\": \"Collection Name\",\n \"info\": \"The name of the collection within Astra DB where the vectors will be stored.\",\n },\n \"token\": {\n \"display_name\": \"Token\",\n \"info\": \"Authentication token for accessing Astra DB.\",\n \"password\": True,\n },\n \"api_endpoint\": {\n \"display_name\": \"API Endpoint\",\n \"info\": \"API endpoint URL for the Astra DB service.\",\n },\n \"namespace\": {\n \"display_name\": \"Namespace\",\n \"info\": \"Optional namespace within Astra DB to use for the collection.\",\n \"advanced\": True,\n },\n \"metric\": {\n \"display_name\": \"Metric\",\n \"info\": \"Optional distance metric for vector comparisons in the vector store.\",\n \"advanced\": True,\n },\n \"batch_size\": {\n \"display_name\": \"Batch Size\",\n \"info\": \"Optional number of records to process in a single batch.\",\n \"advanced\": True,\n },\n \"bulk_insert_batch_concurrency\": {\n \"display_name\": \"Bulk Insert Batch Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations.\",\n \"advanced\": True,\n },\n \"bulk_insert_overwrite_concurrency\": {\n \"display_name\": \"Bulk Insert Overwrite Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations that overwrite existing records.\",\n \"advanced\": True,\n },\n \"bulk_delete_concurrency\": {\n \"display_name\": \"Bulk Delete Concurrency\",\n \"info\": \"Optional concurrency level for bulk delete operations.\",\n \"advanced\": True,\n },\n \"setup_mode\": {\n \"display_name\": \"Setup Mode\",\n \"info\": \"Configuration mode for setting up the vector store, with options like “Sync”, “Async”, or “Off”.\",\n \"options\": [\"Sync\", \"Async\", \"Off\"],\n \"advanced\": True,\n },\n \"pre_delete_collection\": {\n \"display_name\": \"Pre Delete Collection\",\n \"info\": \"Boolean flag to determine whether to delete the collection before creating a new one.\",\n \"advanced\": True,\n },\n \"metadata_indexing_include\": {\n \"display_name\": \"Metadata Indexing Include\",\n \"info\": \"Optional list of metadata fields to include in the indexing.\",\n \"advanced\": True,\n },\n \"metadata_indexing_exclude\": {\n \"display_name\": \"Metadata Indexing Exclude\",\n \"info\": \"Optional list of metadata fields to exclude from the indexing.\",\n \"advanced\": True,\n },\n \"collection_indexing_policy\": {\n \"display_name\": \"Collection Indexing Policy\",\n \"info\": \"Optional dictionary defining the indexing policy for the collection.\",\n \"advanced\": True,\n },\n \"number_of_results\": {\n \"display_name\": \"Number of Results\",\n \"info\": \"Number of results to return.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n embedding: Embeddings,\n collection_name: str,\n input_value: Text,\n token: str,\n api_endpoint: str,\n search_type: str = \"Similarity\",\n number_of_results: int = 4,\n namespace: Optional[str] = None,\n metric: Optional[str] = None,\n batch_size: Optional[int] = None,\n bulk_insert_batch_concurrency: Optional[int] = None,\n bulk_insert_overwrite_concurrency: Optional[int] = None,\n bulk_delete_concurrency: Optional[int] = None,\n setup_mode: str = \"Sync\",\n pre_delete_collection: bool = False,\n metadata_indexing_include: Optional[List[str]] = None,\n metadata_indexing_exclude: Optional[List[str]] = None,\n collection_indexing_policy: Optional[dict] = None,\n ) -> List[Record]:\n vector_store = AstraDBVectorStoreComponent().build(\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n try:\n return self.search_with_vector_store(input_value, search_type, vector_store, k=number_of_results)\n except KeyError as e:\n if \"content\" in str(e):\n raise ValueError(\n \"You should ingest data through Langflow (or LangChain) to query it in Langflow. Your collection does not contain a field name 'content'.\"\n )\n else:\n raise e\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "collection_indexing_policy": {
+ "type": "dict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "collection_indexing_policy",
+ "display_name": "Collection Indexing Policy",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional dictionary defining the indexing policy for the collection.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "collection_name": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "collection_name",
+ "display_name": "Collection Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The name of the collection within Astra DB where the vectors will be stored.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "langflow"
+ },
+ "metadata_indexing_exclude": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metadata_indexing_exclude",
+ "display_name": "Metadata Indexing Exclude",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional list of metadata fields to exclude from the indexing.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "metadata_indexing_include": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metadata_indexing_include",
+ "display_name": "Metadata Indexing Include",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional list of metadata fields to include in the indexing.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "metric": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metric",
+ "display_name": "Metric",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional distance metric for vector comparisons in the vector store.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "namespace": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "namespace",
+ "display_name": "Namespace",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional namespace within Astra DB to use for the collection.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "number_of_results": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 4,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "number_of_results",
+ "display_name": "Number of Results",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Number of results to return.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "pre_delete_collection": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "pre_delete_collection",
+ "display_name": "Pre Delete Collection",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Boolean flag to determine whether to delete the collection before creating a new one.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "search_type": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Similarity",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Similarity", "MMR"],
+ "name": "search_type",
+ "display_name": "Search Type",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "setup_mode": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Sync",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Sync", "Async", "Off"],
+ "name": "setup_mode",
+ "display_name": "Setup Mode",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Configuration mode for setting up the vector store, with options like “Sync”, “Async”, or “Off”.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "token": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "token",
+ "display_name": "Token",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Authentication token for accessing Astra DB.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "ASTRA_DB_APPLICATION_TOKEN"
+ },
+ "_type": "CustomComponent"
},
- {
- "source": "RecursiveCharacterTextSplitter-tR9QM",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153RecursiveCharacterTextSplitter\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153}",
- "target": "AstraDB-eUCSS",
- "targetHandle": "{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Record\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "inputs",
- "id": "AstraDB-eUCSS",
- "inputTypes": null,
- "type": "Record"
- },
- "sourceHandle": {
- "baseClasses": [
- "Record"
- ],
- "dataType": "RecursiveCharacterTextSplitter",
- "id": "RecursiveCharacterTextSplitter-tR9QM"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-RecursiveCharacterTextSplitter-tR9QM{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153RecursiveCharacterTextSplitter\u0153,\u0153id\u0153:\u0153RecursiveCharacterTextSplitter-tR9QM\u0153}-AstraDB-eUCSS{\u0153fieldName\u0153:\u0153inputs\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Record\u0153}",
- "selected": false
+ "description": "Searches an existing Astra DB Vector Store.",
+ "icon": "AstraDB",
+ "base_classes": ["Record"],
+ "display_name": "Astra DB Search",
+ "documentation": "",
+ "custom_fields": {
+ "embedding": null,
+ "collection_name": null,
+ "input_value": null,
+ "token": null,
+ "api_endpoint": null,
+ "search_type": null,
+ "number_of_results": null,
+ "namespace": null,
+ "metric": null,
+ "batch_size": null,
+ "bulk_insert_batch_concurrency": null,
+ "bulk_insert_overwrite_concurrency": null,
+ "bulk_delete_concurrency": null,
+ "setup_mode": null,
+ "pre_delete_collection": null,
+ "metadata_indexing_include": null,
+ "metadata_indexing_exclude": null,
+ "collection_indexing_policy": null
},
- {
- "source": "OpenAIEmbeddings-9TPjc",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-9TPjc\u0153}",
- "target": "AstraDB-eUCSS",
- "targetHandle": "{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "embedding",
- "id": "AstraDB-eUCSS",
- "inputTypes": null,
- "type": "Embeddings"
- },
- "sourceHandle": {
- "baseClasses": [
- "Embeddings"
- ],
- "dataType": "OpenAIEmbeddings",
- "id": "OpenAIEmbeddings-9TPjc"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-OpenAIEmbeddings-9TPjc{\u0153baseClasses\u0153:[\u0153Embeddings\u0153],\u0153dataType\u0153:\u0153OpenAIEmbeddings\u0153,\u0153id\u0153:\u0153OpenAIEmbeddings-9TPjc\u0153}-AstraDB-eUCSS{\u0153fieldName\u0153:\u0153embedding\u0153,\u0153id\u0153:\u0153AstraDB-eUCSS\u0153,\u0153inputTypes\u0153:null,\u0153type\u0153:\u0153Embeddings\u0153}",
- "selected": false
- },
- {
- "source": "AstraDBSearch-41nRz",
- "sourceHandle": "{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153AstraDBSearch\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153}",
- "target": "TextOutput-BDknO",
- "targetHandle": "{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}",
- "data": {
- "targetHandle": {
- "fieldName": "input_value",
- "id": "TextOutput-BDknO",
- "inputTypes": [
- "Record",
- "Text"
- ],
- "type": "str"
- },
- "sourceHandle": {
- "baseClasses": [
- "Record"
- ],
- "dataType": "AstraDBSearch",
- "id": "AstraDBSearch-41nRz"
- }
- },
- "style": {
- "stroke": "#555"
- },
- "className": "stroke-gray-900 stroke-connection",
- "id": "reactflow__edge-AstraDBSearch-41nRz{\u0153baseClasses\u0153:[\u0153Record\u0153],\u0153dataType\u0153:\u0153AstraDBSearch\u0153,\u0153id\u0153:\u0153AstraDBSearch-41nRz\u0153}-TextOutput-BDknO{\u0153fieldName\u0153:\u0153input_value\u0153,\u0153id\u0153:\u0153TextOutput-BDknO\u0153,\u0153inputTypes\u0153:[\u0153Record\u0153,\u0153Text\u0153],\u0153type\u0153:\u0153str\u0153}"
- }
- ],
- "viewport": {
- "x": -259.6782520315529,
- "y": 90.3428735006047,
- "zoom": 0.2687057134854984
+ "output_types": ["Record"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "token",
+ "api_endpoint",
+ "collection_name",
+ "input_value",
+ "embedding"
+ ],
+ "beta": false
+ },
+ "id": "AstraDBSearch-41nRz"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 713,
+ "dragging": false,
+ "positionAbsolute": {
+ "x": 1723.976434815103,
+ "y": 277.03317407245913
}
- },
- "description": "Visit https://pre-release.langflow.org/tutorials/rag-with-astradb for a detailed guide of this project.\nThis project give you both Ingestion and RAG in a single file. You'll need to visit https://astra.datastax.com/ to create an Astra DB instance, your Token and grab an API Endpoint.\nRunning this project requires you to add a file in the Files component, then define a Collection Name and click on the Play icon on the Astra DB component. \n\nAfter the ingestion ends you are ready to click on the Run button at the lower left corner and start asking questions about your data.",
- "name": "Vector Store RAG",
- "last_tested_version": "1.0.0a0",
- "is_component": false
+ },
+ {
+ "id": "AstraDB-eUCSS",
+ "type": "genericNode",
+ "position": {
+ "x": 3372.04958055989,
+ "y": 1611.0742035495277
+ },
+ "data": {
+ "type": "AstraDB",
+ "node": {
+ "template": {
+ "embedding": {
+ "type": "Embeddings",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "embedding",
+ "display_name": "Embedding",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Embedding to use",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "inputs": {
+ "type": "Record",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "inputs",
+ "display_name": "Inputs",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Optional list of records to be processed and stored in the vector store.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "api_endpoint": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "api_endpoint",
+ "display_name": "API Endpoint",
+ "advanced": false,
+ "dynamic": false,
+ "info": "API endpoint URL for the Astra DB service.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "ASTRA_DB_API_ENDPOINT"
+ },
+ "batch_size": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "batch_size",
+ "display_name": "Batch Size",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional number of records to process in a single batch.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_delete_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_delete_concurrency",
+ "display_name": "Bulk Delete Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk delete operations.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_insert_batch_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_insert_batch_concurrency",
+ "display_name": "Bulk Insert Batch Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk insert operations.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "bulk_insert_overwrite_concurrency": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "bulk_insert_overwrite_concurrency",
+ "display_name": "Bulk Insert Overwrite Concurrency",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional concurrency level for bulk insert operations that overwrite existing records.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import List, Optional, Union\nfrom langchain_astradb import AstraDBVectorStore\nfrom langchain_astradb.utils.astradb import SetupMode\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Embeddings, VectorStore\nfrom langflow.schema import Record\nfrom langchain_core.retrievers import BaseRetriever\n\n\nclass AstraDBVectorStoreComponent(CustomComponent):\n display_name = \"Astra DB\"\n description = \"Builds or loads an Astra DB Vector Store.\"\n icon = \"AstraDB\"\n field_order = [\"token\", \"api_endpoint\", \"collection_name\", \"inputs\", \"embedding\"]\n\n def build_config(self):\n return {\n \"inputs\": {\n \"display_name\": \"Inputs\",\n \"info\": \"Optional list of records to be processed and stored in the vector store.\",\n },\n \"embedding\": {\"display_name\": \"Embedding\", \"info\": \"Embedding to use\"},\n \"collection_name\": {\n \"display_name\": \"Collection Name\",\n \"info\": \"The name of the collection within Astra DB where the vectors will be stored.\",\n },\n \"token\": {\n \"display_name\": \"Token\",\n \"info\": \"Authentication token for accessing Astra DB.\",\n \"password\": True,\n },\n \"api_endpoint\": {\n \"display_name\": \"API Endpoint\",\n \"info\": \"API endpoint URL for the Astra DB service.\",\n },\n \"namespace\": {\n \"display_name\": \"Namespace\",\n \"info\": \"Optional namespace within Astra DB to use for the collection.\",\n \"advanced\": True,\n },\n \"metric\": {\n \"display_name\": \"Metric\",\n \"info\": \"Optional distance metric for vector comparisons in the vector store.\",\n \"advanced\": True,\n },\n \"batch_size\": {\n \"display_name\": \"Batch Size\",\n \"info\": \"Optional number of records to process in a single batch.\",\n \"advanced\": True,\n },\n \"bulk_insert_batch_concurrency\": {\n \"display_name\": \"Bulk Insert Batch Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations.\",\n \"advanced\": True,\n },\n 
\"bulk_insert_overwrite_concurrency\": {\n \"display_name\": \"Bulk Insert Overwrite Concurrency\",\n \"info\": \"Optional concurrency level for bulk insert operations that overwrite existing records.\",\n \"advanced\": True,\n },\n \"bulk_delete_concurrency\": {\n \"display_name\": \"Bulk Delete Concurrency\",\n \"info\": \"Optional concurrency level for bulk delete operations.\",\n \"advanced\": True,\n },\n \"setup_mode\": {\n \"display_name\": \"Setup Mode\",\n \"info\": \"Configuration mode for setting up the vector store, with options like “Sync”, “Async”, or “Off”.\",\n \"options\": [\"Sync\", \"Async\", \"Off\"],\n \"advanced\": True,\n },\n \"pre_delete_collection\": {\n \"display_name\": \"Pre Delete Collection\",\n \"info\": \"Boolean flag to determine whether to delete the collection before creating a new one.\",\n \"advanced\": True,\n },\n \"metadata_indexing_include\": {\n \"display_name\": \"Metadata Indexing Include\",\n \"info\": \"Optional list of metadata fields to include in the indexing.\",\n \"advanced\": True,\n },\n \"metadata_indexing_exclude\": {\n \"display_name\": \"Metadata Indexing Exclude\",\n \"info\": \"Optional list of metadata fields to exclude from the indexing.\",\n \"advanced\": True,\n },\n \"collection_indexing_policy\": {\n \"display_name\": \"Collection Indexing Policy\",\n \"info\": \"Optional dictionary defining the indexing policy for the collection.\",\n \"advanced\": True,\n },\n }\n\n def build(\n self,\n embedding: Embeddings,\n token: str,\n api_endpoint: str,\n collection_name: str,\n inputs: Optional[List[Record]] = None,\n namespace: Optional[str] = None,\n metric: Optional[str] = None,\n batch_size: Optional[int] = None,\n bulk_insert_batch_concurrency: Optional[int] = None,\n bulk_insert_overwrite_concurrency: Optional[int] = None,\n bulk_delete_concurrency: Optional[int] = None,\n setup_mode: str = \"Sync\",\n pre_delete_collection: bool = False,\n metadata_indexing_include: Optional[List[str]] = None,\n 
metadata_indexing_exclude: Optional[List[str]] = None,\n collection_indexing_policy: Optional[dict] = None,\n ) -> Union[VectorStore, BaseRetriever]:\n try:\n setup_mode_value = SetupMode[setup_mode.upper()]\n except KeyError:\n raise ValueError(f\"Invalid setup mode: {setup_mode}\")\n if inputs:\n documents = [_input.to_lc_document() for _input in inputs]\n\n vector_store = AstraDBVectorStore.from_documents(\n documents=documents,\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode_value,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n else:\n vector_store = AstraDBVectorStore(\n embedding=embedding,\n collection_name=collection_name,\n token=token,\n api_endpoint=api_endpoint,\n namespace=namespace,\n metric=metric,\n batch_size=batch_size,\n bulk_insert_batch_concurrency=bulk_insert_batch_concurrency,\n bulk_insert_overwrite_concurrency=bulk_insert_overwrite_concurrency,\n bulk_delete_concurrency=bulk_delete_concurrency,\n setup_mode=setup_mode_value,\n pre_delete_collection=pre_delete_collection,\n metadata_indexing_include=metadata_indexing_include,\n metadata_indexing_exclude=metadata_indexing_exclude,\n collection_indexing_policy=collection_indexing_policy,\n )\n\n return vector_store\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "collection_indexing_policy": {
+ "type": "dict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "collection_indexing_policy",
+ "display_name": "Collection Indexing Policy",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional dictionary defining the indexing policy for the collection.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "collection_name": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "collection_name",
+ "display_name": "Collection Name",
+ "advanced": false,
+ "dynamic": false,
+ "info": "The name of the collection within Astra DB where the vectors will be stored.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "langflow"
+ },
+ "metadata_indexing_exclude": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metadata_indexing_exclude",
+ "display_name": "Metadata Indexing Exclude",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional list of metadata fields to exclude from the indexing.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "metadata_indexing_include": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metadata_indexing_include",
+ "display_name": "Metadata Indexing Include",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional list of metadata fields to include in the indexing.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "metric": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "metric",
+ "display_name": "Metric",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional distance metric for vector comparisons in the vector store.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "namespace": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "namespace",
+ "display_name": "Namespace",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Optional namespace within Astra DB to use for the collection.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "pre_delete_collection": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "pre_delete_collection",
+ "display_name": "Pre Delete Collection",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Boolean flag to determine whether to delete the collection before creating a new one.",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "setup_mode": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "Sync",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": ["Sync", "Async", "Off"],
+ "name": "setup_mode",
+ "display_name": "Setup Mode",
+ "advanced": true,
+ "dynamic": false,
+ "info": "Configuration mode for setting up the vector store, with options like “Sync”, “Async”, or “Off”.",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "token": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "token",
+ "display_name": "Token",
+ "advanced": false,
+ "dynamic": false,
+ "info": "Authentication token for accessing Astra DB.",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "ASTRA_DB_APPLICATION_TOKEN"
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Builds or loads an Astra DB Vector Store.",
+ "icon": "AstraDB",
+ "base_classes": ["VectorStore"],
+ "display_name": "Astra DB",
+ "documentation": "",
+ "custom_fields": {
+ "embedding": null,
+ "token": null,
+ "api_endpoint": null,
+ "collection_name": null,
+ "inputs": null,
+ "namespace": null,
+ "metric": null,
+ "batch_size": null,
+ "bulk_insert_batch_concurrency": null,
+ "bulk_insert_overwrite_concurrency": null,
+ "bulk_delete_concurrency": null,
+ "setup_mode": null,
+ "pre_delete_collection": null,
+ "metadata_indexing_include": null,
+ "metadata_indexing_exclude": null,
+ "collection_indexing_policy": null
+ },
+ "output_types": ["VectorStore"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [
+ "token",
+ "api_endpoint",
+ "collection_name",
+ "inputs",
+ "embedding"
+ ],
+ "beta": false
+ },
+ "id": "AstraDB-eUCSS"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 573,
+ "positionAbsolute": {
+ "x": 3372.04958055989,
+ "y": 1611.0742035495277
+ },
+ "dragging": false
+ },
+ {
+ "id": "OpenAIEmbeddings-9TPjc",
+ "type": "genericNode",
+ "position": {
+ "x": 2814.0402191223047,
+ "y": 1955.9268168273086
+ },
+ "data": {
+ "type": "OpenAIEmbeddings",
+ "node": {
+ "template": {
+ "allowed_special": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": [],
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "allowed_special",
+ "display_name": "Allowed Special",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "chunk_size": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 1000,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "chunk_size",
+ "display_name": "Chunk Size",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "client": {
+ "type": "Any",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "client",
+ "display_name": "Client",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "code": {
+ "type": "code",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": true,
+ "value": "from typing import Dict, List, Optional\n\nfrom langchain_openai.embeddings.base import OpenAIEmbeddings\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.custom import CustomComponent\nfrom langflow.field_typing import Embeddings, NestedDict\n\n\nclass OpenAIEmbeddingsComponent(CustomComponent):\n display_name = \"OpenAI Embeddings\"\n description = \"Generate embeddings using OpenAI models.\"\n\n def build_config(self):\n return {\n \"allowed_special\": {\n \"display_name\": \"Allowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"default_headers\": {\n \"display_name\": \"Default Headers\",\n \"advanced\": True,\n \"field_type\": \"dict\",\n },\n \"default_query\": {\n \"display_name\": \"Default Query\",\n \"advanced\": True,\n \"field_type\": \"NestedDict\",\n },\n \"disallowed_special\": {\n \"display_name\": \"Disallowed Special\",\n \"advanced\": True,\n \"field_type\": \"str\",\n \"is_list\": True,\n },\n \"chunk_size\": {\"display_name\": \"Chunk Size\", \"advanced\": True},\n \"client\": {\"display_name\": \"Client\", \"advanced\": True},\n \"deployment\": {\"display_name\": \"Deployment\", \"advanced\": True},\n \"embedding_ctx_length\": {\n \"display_name\": \"Embedding Context Length\",\n \"advanced\": True,\n },\n \"max_retries\": {\"display_name\": \"Max Retries\", \"advanced\": True},\n \"model\": {\n \"display_name\": \"Model\",\n \"advanced\": False,\n \"options\": [\n \"text-embedding-3-small\",\n \"text-embedding-3-large\",\n \"text-embedding-ada-002\",\n ],\n },\n \"model_kwargs\": {\"display_name\": \"Model Kwargs\", \"advanced\": True},\n \"openai_api_base\": {\n \"display_name\": \"OpenAI API Base\",\n \"password\": True,\n \"advanced\": True,\n },\n \"openai_api_key\": {\"display_name\": \"OpenAI API Key\", \"password\": True},\n \"openai_api_type\": {\n \"display_name\": \"OpenAI API Type\",\n \"advanced\": True,\n \"password\": True,\n },\n \"openai_api_version\": {\n 
\"display_name\": \"OpenAI API Version\",\n \"advanced\": True,\n },\n \"openai_organization\": {\n \"display_name\": \"OpenAI Organization\",\n \"advanced\": True,\n },\n \"openai_proxy\": {\"display_name\": \"OpenAI Proxy\", \"advanced\": True},\n \"request_timeout\": {\"display_name\": \"Request Timeout\", \"advanced\": True},\n \"show_progress_bar\": {\n \"display_name\": \"Show Progress Bar\",\n \"advanced\": True,\n },\n \"skip_empty\": {\"display_name\": \"Skip Empty\", \"advanced\": True},\n \"tiktoken_model_name\": {\n \"display_name\": \"TikToken Model Name\",\n \"advanced\": True,\n },\n \"tiktoken_enable\": {\"display_name\": \"TikToken Enable\", \"advanced\": True},\n }\n\n def build(\n self,\n openai_api_key: str,\n default_headers: Optional[Dict[str, str]] = None,\n default_query: Optional[NestedDict] = {},\n allowed_special: List[str] = [],\n disallowed_special: List[str] = [\"all\"],\n chunk_size: int = 1000,\n deployment: str = \"text-embedding-ada-002\",\n embedding_ctx_length: int = 8191,\n max_retries: int = 6,\n model: str = \"text-embedding-ada-002\",\n model_kwargs: NestedDict = {},\n openai_api_base: Optional[str] = None,\n openai_api_type: Optional[str] = None,\n openai_api_version: Optional[str] = None,\n openai_organization: Optional[str] = None,\n openai_proxy: Optional[str] = None,\n request_timeout: Optional[float] = None,\n show_progress_bar: bool = False,\n skip_empty: bool = False,\n tiktoken_enable: bool = True,\n tiktoken_model_name: Optional[str] = None,\n ) -> Embeddings:\n # This is to avoid errors with Vector Stores (e.g Chroma)\n if disallowed_special == [\"all\"]:\n disallowed_special = \"all\" # type: ignore\n if openai_api_key:\n api_key = SecretStr(openai_api_key)\n else:\n api_key = None\n\n return OpenAIEmbeddings(\n tiktoken_enabled=tiktoken_enable,\n default_headers=default_headers,\n default_query=default_query,\n allowed_special=set(allowed_special),\n disallowed_special=\"all\",\n chunk_size=chunk_size,\n 
deployment=deployment,\n embedding_ctx_length=embedding_ctx_length,\n max_retries=max_retries,\n model=model,\n model_kwargs=model_kwargs,\n base_url=openai_api_base,\n api_key=api_key,\n openai_api_type=openai_api_type,\n api_version=openai_api_version,\n organization=openai_organization,\n openai_proxy=openai_proxy,\n timeout=request_timeout,\n show_progress_bar=show_progress_bar,\n skip_empty=skip_empty,\n tiktoken_model_name=tiktoken_model_name,\n )\n",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "code",
+ "advanced": true,
+ "dynamic": true,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "default_headers": {
+ "type": "dict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "default_headers",
+ "display_name": "Default Headers",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "default_query": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "default_query",
+ "display_name": "Default Query",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "deployment": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": "text-embedding-ada-002",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "deployment",
+ "display_name": "Deployment",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "disallowed_special": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": ["all"],
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "disallowed_special",
+ "display_name": "Disallowed Special",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "embedding_ctx_length": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 8191,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "embedding_ctx_length",
+ "display_name": "Embedding Context Length",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "max_retries": {
+ "type": "int",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": 6,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "max_retries",
+ "display_name": "Max Retries",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "model": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": true,
+ "show": true,
+ "multiline": false,
+ "value": "text-embedding-ada-002",
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "options": [
+ "text-embedding-3-small",
+ "text-embedding-3-large",
+ "text-embedding-ada-002"
+ ],
+ "name": "model",
+ "display_name": "Model",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "model_kwargs": {
+ "type": "NestedDict",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": {},
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "model_kwargs",
+ "display_name": "Model Kwargs",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "openai_api_base": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_base",
+ "display_name": "OpenAI API Base",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_key": {
+ "type": "str",
+ "required": true,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_key",
+ "display_name": "OpenAI API Key",
+ "advanced": false,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": true,
+ "title_case": false,
+ "input_types": ["Text"],
+ "value": "OPENAI_API_KEY"
+ },
+ "openai_api_type": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": true,
+ "name": "openai_api_type",
+ "display_name": "OpenAI API Type",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_api_version": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_api_version",
+ "display_name": "OpenAI API Version",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_organization": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_organization",
+ "display_name": "OpenAI Organization",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "openai_proxy": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "openai_proxy",
+ "display_name": "OpenAI Proxy",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "request_timeout": {
+ "type": "float",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "request_timeout",
+ "display_name": "Request Timeout",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "rangeSpec": {
+ "step_type": "float",
+ "min": -1,
+ "max": 1,
+ "step": 0.1
+ },
+ "load_from_db": false,
+ "title_case": false
+ },
+ "show_progress_bar": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "show_progress_bar",
+ "display_name": "Show Progress Bar",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "skip_empty": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "skip_empty",
+ "display_name": "Skip Empty",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "tiktoken_enable": {
+ "type": "bool",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "value": true,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "tiktoken_enable",
+ "display_name": "TikToken Enable",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false
+ },
+ "tiktoken_model_name": {
+ "type": "str",
+ "required": false,
+ "placeholder": "",
+ "list": false,
+ "show": true,
+ "multiline": false,
+ "fileTypes": [],
+ "file_path": "",
+ "password": false,
+ "name": "tiktoken_model_name",
+ "display_name": "TikToken Model Name",
+ "advanced": true,
+ "dynamic": false,
+ "info": "",
+ "load_from_db": false,
+ "title_case": false,
+ "input_types": ["Text"]
+ },
+ "_type": "CustomComponent"
+ },
+ "description": "Generate embeddings using OpenAI models.",
+ "base_classes": ["Embeddings"],
+ "display_name": "OpenAI Embeddings",
+ "documentation": "",
+ "custom_fields": {
+ "openai_api_key": null,
+ "default_headers": null,
+ "default_query": null,
+ "allowed_special": null,
+ "disallowed_special": null,
+ "chunk_size": null,
+ "client": null,
+ "deployment": null,
+ "embedding_ctx_length": null,
+ "max_retries": null,
+ "model": null,
+ "model_kwargs": null,
+ "openai_api_base": null,
+ "openai_api_type": null,
+ "openai_api_version": null,
+ "openai_organization": null,
+ "openai_proxy": null,
+ "request_timeout": null,
+ "show_progress_bar": null,
+ "skip_empty": null,
+ "tiktoken_enable": null,
+ "tiktoken_model_name": null
+ },
+ "output_types": ["Embeddings"],
+ "field_formatters": {},
+ "frozen": false,
+ "field_order": [],
+ "beta": false
+ },
+ "id": "OpenAIEmbeddings-9TPjc"
+ },
+ "selected": false,
+ "width": 384,
+ "height": 383,
+ "positionAbsolute": {
+ "x": 2814.0402191223047,
+ "y": 1955.9268168273086
+ },
+ "dragging": false
+ }
+ ],
+ "edges": [
+ {
+ "source": "TextOutput-BDknO",
+ "target": "Prompt-xeI6K",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œTextOutputœ,œidœ:œTextOutput-BDknOœ}",
+ "targetHandle": "{œfieldNameœ:œcontextœ,œidœ:œPrompt-xeI6Kœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "id": "reactflow__edge-TextOutput-BDknO{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œTextOutputœ,œidœ:œTextOutput-BDknOœ}-Prompt-xeI6K{œfieldNameœ:œcontextœ,œidœ:œPrompt-xeI6Kœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "context",
+ "id": "Prompt-xeI6K",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "Text", "str"],
+ "dataType": "TextOutput",
+ "id": "TextOutput-BDknO"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "selected": false
+ },
+ {
+ "source": "ChatInput-yxMKE",
+ "target": "Prompt-xeI6K",
+ "sourceHandle": "{œbaseClassesœ:[œTextœ,œstrœ,œobjectœ,œRecordœ],œdataTypeœ:œChatInputœ,œidœ:œChatInput-yxMKEœ}",
+ "targetHandle": "{œfieldNameœ:œquestionœ,œidœ:œPrompt-xeI6Kœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "id": "reactflow__edge-ChatInput-yxMKE{œbaseClassesœ:[œTextœ,œstrœ,œobjectœ,œRecordœ],œdataTypeœ:œChatInputœ,œidœ:œChatInput-yxMKEœ}-Prompt-xeI6K{œfieldNameœ:œquestionœ,œidœ:œPrompt-xeI6Kœ,œinputTypesœ:[œDocumentœ,œBaseOutputParserœ,œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "question",
+ "id": "Prompt-xeI6K",
+ "inputTypes": ["Document", "BaseOutputParser", "Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Text", "str", "object", "Record"],
+ "dataType": "ChatInput",
+ "id": "ChatInput-yxMKE"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "selected": false
+ },
+ {
+ "source": "Prompt-xeI6K",
+ "target": "OpenAIModel-EjXlN",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-xeI6Kœ}",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-EjXlNœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "id": "reactflow__edge-Prompt-xeI6K{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œPromptœ,œidœ:œPrompt-xeI6Kœ}-OpenAIModel-EjXlN{œfieldNameœ:œinput_valueœ,œidœ:œOpenAIModel-EjXlNœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "OpenAIModel-EjXlN",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "Text", "str"],
+ "dataType": "Prompt",
+ "id": "Prompt-xeI6K"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "selected": false
+ },
+ {
+ "source": "OpenAIModel-EjXlN",
+ "target": "ChatOutput-Q39I8",
+ "sourceHandle": "{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-EjXlNœ}",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-Q39I8œ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "id": "reactflow__edge-OpenAIModel-EjXlN{œbaseClassesœ:[œobjectœ,œTextœ,œstrœ],œdataTypeœ:œOpenAIModelœ,œidœ:œOpenAIModel-EjXlNœ}-ChatOutput-Q39I8{œfieldNameœ:œinput_valueœ,œidœ:œChatOutput-Q39I8œ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "ChatOutput-Q39I8",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["object", "Text", "str"],
+ "dataType": "OpenAIModel",
+ "id": "OpenAIModel-EjXlN"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "selected": false
+ },
+ {
+ "source": "File-t0a6a",
+ "target": "RecursiveCharacterTextSplitter-tR9QM",
+ "sourceHandle": "{œbaseClassesœ:[œRecordœ],œdataTypeœ:œFileœ,œidœ:œFile-t0a6aœ}",
+ "targetHandle": "{œfieldNameœ:œinputsœ,œidœ:œRecursiveCharacterTextSplitter-tR9QMœ,œinputTypesœ:[œDocumentœ,œRecordœ],œtypeœ:œDocumentœ}",
+ "id": "reactflow__edge-File-t0a6a{œbaseClassesœ:[œRecordœ],œdataTypeœ:œFileœ,œidœ:œFile-t0a6aœ}-RecursiveCharacterTextSplitter-tR9QM{œfieldNameœ:œinputsœ,œidœ:œRecursiveCharacterTextSplitter-tR9QMœ,œinputTypesœ:[œDocumentœ,œRecordœ],œtypeœ:œDocumentœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "inputs",
+ "id": "RecursiveCharacterTextSplitter-tR9QM",
+ "inputTypes": ["Document", "Record"],
+ "type": "Document"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Record"],
+ "dataType": "File",
+ "id": "File-t0a6a"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "selected": false
+ },
+ {
+ "source": "OpenAIEmbeddings-ZlOk1",
+ "sourceHandle": "{œbaseClassesœ:[œEmbeddingsœ],œdataTypeœ:œOpenAIEmbeddingsœ,œidœ:œOpenAIEmbeddings-ZlOk1œ}",
+ "target": "AstraDBSearch-41nRz",
+ "targetHandle": "{œfieldNameœ:œembeddingœ,œidœ:œAstraDBSearch-41nRzœ,œinputTypesœ:null,œtypeœ:œEmbeddingsœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "embedding",
+ "id": "AstraDBSearch-41nRz",
+ "inputTypes": null,
+ "type": "Embeddings"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Embeddings"],
+ "dataType": "OpenAIEmbeddings",
+ "id": "OpenAIEmbeddings-ZlOk1"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-OpenAIEmbeddings-ZlOk1{œbaseClassesœ:[œEmbeddingsœ],œdataTypeœ:œOpenAIEmbeddingsœ,œidœ:œOpenAIEmbeddings-ZlOk1œ}-AstraDBSearch-41nRz{œfieldNameœ:œembeddingœ,œidœ:œAstraDBSearch-41nRzœ,œinputTypesœ:null,œtypeœ:œEmbeddingsœ}"
+ },
+ {
+ "source": "ChatInput-yxMKE",
+ "sourceHandle": "{œbaseClassesœ:[œTextœ,œstrœ,œobjectœ,œRecordœ],œdataTypeœ:œChatInputœ,œidœ:œChatInput-yxMKEœ}",
+ "target": "AstraDBSearch-41nRz",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œAstraDBSearch-41nRzœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "AstraDBSearch-41nRz",
+ "inputTypes": ["Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Text", "str", "object", "Record"],
+ "dataType": "ChatInput",
+ "id": "ChatInput-yxMKE"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-ChatInput-yxMKE{œbaseClassesœ:[œTextœ,œstrœ,œobjectœ,œRecordœ],œdataTypeœ:œChatInputœ,œidœ:œChatInput-yxMKEœ}-AstraDBSearch-41nRz{œfieldNameœ:œinput_valueœ,œidœ:œAstraDBSearch-41nRzœ,œinputTypesœ:[œTextœ],œtypeœ:œstrœ}"
+ },
+ {
+ "source": "RecursiveCharacterTextSplitter-tR9QM",
+ "sourceHandle": "{œbaseClassesœ:[œRecordœ],œdataTypeœ:œRecursiveCharacterTextSplitterœ,œidœ:œRecursiveCharacterTextSplitter-tR9QMœ}",
+ "target": "AstraDB-eUCSS",
+ "targetHandle": "{œfieldNameœ:œinputsœ,œidœ:œAstraDB-eUCSSœ,œinputTypesœ:null,œtypeœ:œRecordœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "inputs",
+ "id": "AstraDB-eUCSS",
+ "inputTypes": null,
+ "type": "Record"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Record"],
+ "dataType": "RecursiveCharacterTextSplitter",
+ "id": "RecursiveCharacterTextSplitter-tR9QM"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-RecursiveCharacterTextSplitter-tR9QM{œbaseClassesœ:[œRecordœ],œdataTypeœ:œRecursiveCharacterTextSplitterœ,œidœ:œRecursiveCharacterTextSplitter-tR9QMœ}-AstraDB-eUCSS{œfieldNameœ:œinputsœ,œidœ:œAstraDB-eUCSSœ,œinputTypesœ:null,œtypeœ:œRecordœ}",
+ "selected": false
+ },
+ {
+ "source": "OpenAIEmbeddings-9TPjc",
+ "sourceHandle": "{œbaseClassesœ:[œEmbeddingsœ],œdataTypeœ:œOpenAIEmbeddingsœ,œidœ:œOpenAIEmbeddings-9TPjcœ}",
+ "target": "AstraDB-eUCSS",
+ "targetHandle": "{œfieldNameœ:œembeddingœ,œidœ:œAstraDB-eUCSSœ,œinputTypesœ:null,œtypeœ:œEmbeddingsœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "embedding",
+ "id": "AstraDB-eUCSS",
+ "inputTypes": null,
+ "type": "Embeddings"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Embeddings"],
+ "dataType": "OpenAIEmbeddings",
+ "id": "OpenAIEmbeddings-9TPjc"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-OpenAIEmbeddings-9TPjc{œbaseClassesœ:[œEmbeddingsœ],œdataTypeœ:œOpenAIEmbeddingsœ,œidœ:œOpenAIEmbeddings-9TPjcœ}-AstraDB-eUCSS{œfieldNameœ:œembeddingœ,œidœ:œAstraDB-eUCSSœ,œinputTypesœ:null,œtypeœ:œEmbeddingsœ}",
+ "selected": false
+ },
+ {
+ "source": "AstraDBSearch-41nRz",
+ "sourceHandle": "{œbaseClassesœ:[œRecordœ],œdataTypeœ:œAstraDBSearchœ,œidœ:œAstraDBSearch-41nRzœ}",
+ "target": "TextOutput-BDknO",
+ "targetHandle": "{œfieldNameœ:œinput_valueœ,œidœ:œTextOutput-BDknOœ,œinputTypesœ:[œRecordœ,œTextœ],œtypeœ:œstrœ}",
+ "data": {
+ "targetHandle": {
+ "fieldName": "input_value",
+ "id": "TextOutput-BDknO",
+ "inputTypes": ["Record", "Text"],
+ "type": "str"
+ },
+ "sourceHandle": {
+ "baseClasses": ["Record"],
+ "dataType": "AstraDBSearch",
+ "id": "AstraDBSearch-41nRz"
+ }
+ },
+ "style": {
+ "stroke": "#555"
+ },
+ "className": "stroke-gray-900 stroke-connection",
+ "id": "reactflow__edge-AstraDBSearch-41nRz{œbaseClassesœ:[œRecordœ],œdataTypeœ:œAstraDBSearchœ,œidœ:œAstraDBSearch-41nRzœ}-TextOutput-BDknO{œfieldNameœ:œinput_valueœ,œidœ:œTextOutput-BDknOœ,œinputTypesœ:[œRecordœ,œTextœ],œtypeœ:œstrœ}"
+ }
+ ],
+ "viewport": {
+ "x": -259.6782520315529,
+ "y": 90.3428735006047,
+ "zoom": 0.2687057134854984
+ }
+ },
+ "description": "Visit https://pre-release.langflow.org/tutorials/rag-with-astradb for a detailed guide to this project.\nThis project gives you both Ingestion and RAG in a single file. You'll need to visit https://astra.datastax.com/ to create an Astra DB instance, create your Token, and grab an API Endpoint.\nRunning this project requires you to add a file in the Files component, then define a Collection Name and click on the Play icon on the Astra DB component. \n\nAfter the ingestion ends, you are ready to click on the Run button at the lower left corner and start asking questions about your data.",
+ "name": "Vector Store RAG",
+ "last_tested_version": "1.0.0a0",
+ "is_component": false
}
diff --git a/src/frontend/package-lock.json b/src/frontend/package-lock.json
index 363415ccf..69e621774 100644
--- a/src/frontend/package-lock.json
+++ b/src/frontend/package-lock.json
@@ -40,6 +40,7 @@
"class-variance-authority": "^0.6.1",
"clsx": "^1.2.1",
"cmdk": "^1.0.0",
+ "debounce-promise": "^3.1.2",
"dompurify": "^3.0.5",
"dotenv": "^16.4.5",
"esbuild": "^0.17.19",
@@ -5670,6 +5671,11 @@
"node": ">=12"
}
},
+ "node_modules/debounce-promise": {
+ "version": "3.1.2",
+ "resolved": "https://registry.npmjs.org/debounce-promise/-/debounce-promise-3.1.2.tgz",
+ "integrity": "sha512-rZHcgBkbYavBeD9ej6sP56XfG53d51CD4dnaw989YX/nZ/ZJfgRx/9ePKmTNiUiyQvh4mtrMoS3OAWW+yoYtpg=="
+ },
"node_modules/debug": {
"version": "4.3.4",
"resolved": "https://registry.npmjs.org/debug/-/debug-4.3.4.tgz",
diff --git a/src/frontend/package.json b/src/frontend/package.json
index 053a246ee..84feb826f 100644
--- a/src/frontend/package.json
+++ b/src/frontend/package.json
@@ -35,6 +35,7 @@
"class-variance-authority": "^0.6.1",
"clsx": "^1.2.1",
"cmdk": "^1.0.0",
+ "debounce-promise": "^3.1.2",
"dompurify": "^3.0.5",
"dotenv": "^16.4.5",
"esbuild": "^0.17.19",
diff --git a/src/frontend/playwright.config.ts b/src/frontend/playwright.config.ts
index 9535e0a15..eeb9497ae 100644
--- a/src/frontend/playwright.config.ts
+++ b/src/frontend/playwright.config.ts
@@ -15,7 +15,7 @@ dotenv.config({ path: path.resolve(__dirname, "../../.env") });
export default defineConfig({
testDir: "./tests",
/* Run tests in files in parallel */
- fullyParallel: true,
+ fullyParallel: false,
/* Fail the build on CI if you accidentally left test.only in the source code. */
forbidOnly: !!process.env.CI,
/* Retry on CI only */
@@ -52,18 +52,18 @@ export default defineConfig({
},
},
- {
- name: "firefox",
- use: {
- ...devices["Desktop Firefox"],
- launchOptions: {
- firefoxUserPrefs: {
- "dom.events.asyncClipboard.readText": true,
- "dom.events.testing.asyncClipboard": true,
- },
- },
- },
- },
+ // {
+ // name: "firefox",
+ // use: {
+ // ...devices["Desktop Firefox"],
+ // launchOptions: {
+ // firefoxUserPrefs: {
+ // "dom.events.asyncClipboard.readText": true,
+ // "dom.events.testing.asyncClipboard": true,
+ // },
+ // },
+ // },
+ // },
],
webServer: [
{
diff --git a/src/frontend/src/App.tsx b/src/frontend/src/App.tsx
index 3d144f194..b9e02a27d 100644
--- a/src/frontend/src/App.tsx
+++ b/src/frontend/src/App.tsx
@@ -30,10 +30,10 @@ export default function App() {
useTrackLastVisitedPath();
const removeFromTempNotificationList = useAlertStore(
- (state) => state.removeFromTempNotificationList
+ (state) => state.removeFromTempNotificationList,
);
const tempNotificationList = useAlertStore(
- (state) => state.tempNotificationList
+ (state) => state.tempNotificationList,
);
const [fetchError, setFetchError] = useState(false);
const isLoading = useFlowsManagerStore((state) => state.isLoading);
@@ -51,7 +51,7 @@ export default function App() {
const refreshVersion = useDarkStore((state) => state.refreshVersion);
const refreshStars = useDarkStore((state) => state.refreshStars);
const setGlobalVariables = useGlobalVariablesStore(
- (state) => state.setGlobalVariables
+ (state) => state.setGlobalVariables,
);
const checkHasStore = useStoreStore((state) => state.checkHasStore);
const navigate = useNavigate();
@@ -222,12 +222,19 @@ export default function App() {
id={alert.id}
removeAlert={removeAlert}
/>
+ ) : alert.type === "notice" ? (
+
) : (
- alert.type === "notice" && (
-
@@ -236,20 +243,6 @@ export default function App() {
))}
-
- {tempNotificationList.map((alert) => (
-
- {alert.type === "success" && (
-
- )}
-
- ))}
-
);
diff --git a/src/frontend/src/alerts/alertDropDown/index.tsx b/src/frontend/src/alerts/alertDropDown/index.tsx
index 3577a5de6..05f42922d 100644
--- a/src/frontend/src/alerts/alertDropDown/index.tsx
+++ b/src/frontend/src/alerts/alertDropDown/index.tsx
@@ -16,13 +16,13 @@ export default function AlertDropdown({
}: AlertDropdownType): JSX.Element {
const notificationList = useAlertStore((state) => state.notificationList);
const clearNotificationList = useAlertStore(
- (state) => state.clearNotificationList
+ (state) => state.clearNotificationList,
);
const removeFromNotificationList = useAlertStore(
- (state) => state.removeFromNotificationList
+ (state) => state.removeFromNotificationList,
);
const setNotificationCenter = useAlertStore(
- (state) => state.setNotificationCenter
+ (state) => state.setNotificationCenter,
);
const [open, setOpen] = useState(false);
@@ -36,7 +36,7 @@ export default function AlertDropdown({
}}
>
{children}
-
+
Notifications
diff --git a/src/frontend/src/alerts/error/index.tsx b/src/frontend/src/alerts/error/index.tsx
index ec23c103e..3690590b9 100644
--- a/src/frontend/src/alerts/error/index.tsx
+++ b/src/frontend/src/alerts/error/index.tsx
@@ -40,7 +40,7 @@ export default function ErrorAlert({
removeAlert(id);
}, 500);
}}
- className="error-build-message nocopy nopan nodelete nodrag noundo"
+ className="error-build-message nocopy nowheel nopan nodelete nodrag noundo"
>
diff --git a/src/frontend/src/alerts/notice/index.tsx b/src/frontend/src/alerts/notice/index.tsx
index faaa4db6a..fb29954ea 100644
--- a/src/frontend/src/alerts/notice/index.tsx
+++ b/src/frontend/src/alerts/notice/index.tsx
@@ -36,7 +36,7 @@ export default function NoticeAlert({
setShow(false);
removeAlert(id);
}}
- className="nocopy nopan nodelete nodrag noundo mt-6 w-96 rounded-md bg-info-background p-4 shadow-xl"
+ className="nocopy nowheel nopan nodelete nodrag noundo mt-6 w-96 rounded-md bg-info-background p-4 shadow-xl"
>
diff --git a/src/frontend/src/alerts/success/index.tsx b/src/frontend/src/alerts/success/index.tsx
index ec6abf589..db62c8432 100644
--- a/src/frontend/src/alerts/success/index.tsx
+++ b/src/frontend/src/alerts/success/index.tsx
@@ -34,7 +34,7 @@ export default function SuccessAlert({
setShow(false);
removeAlert(id);
}}
- className="success-alert nocopy nopan nodelete nodrag noundo"
+ className="success-alert nocopy nowheel nopan nodelete nodrag noundo"
>
diff --git a/src/frontend/src/components/accordionComponent/composite/folderAccordionComponent/index.tsx b/src/frontend/src/components/accordionComponent/composite/folderAccordionComponent/index.tsx
index 212a03fa2..d4fb95b5c 100644
--- a/src/frontend/src/components/accordionComponent/composite/folderAccordionComponent/index.tsx
+++ b/src/frontend/src/components/accordionComponent/composite/folderAccordionComponent/index.tsx
@@ -15,7 +15,7 @@ export default function FolderAccordionComponent({
options,
}: AccordionComponentType): JSX.Element {
const [value, setValue] = useState(
- open.length === 0 ? "" : getOpenAccordion()
+ open.length === 0 ? "" : getOpenAccordion(),
);
function getOpenAccordion(): string {
diff --git a/src/frontend/src/components/accordionComponent/index.tsx b/src/frontend/src/components/accordionComponent/index.tsx
index fdcf8b96c..7c5562e7f 100644
--- a/src/frontend/src/components/accordionComponent/index.tsx
+++ b/src/frontend/src/components/accordionComponent/index.tsx
@@ -6,10 +6,13 @@ import {
AccordionTrigger,
} from "../../components/ui/accordion";
import { AccordionComponentType } from "../../types/components";
+import { cn } from "../../utils/utils";
+import ShadTooltip from "../shadTooltipComponent";
export default function AccordionComponent({
trigger,
children,
+ disabled,
open = [],
keyValue,
sideBar,
@@ -29,7 +32,9 @@ export default function AccordionComponent({
}
function handleClick(): void {
- value === "" ? setValue(keyValue!) : setValue("");
+ if (!disabled) {
+ value === "" ? setValue(keyValue!) : setValue("");
+ }
}
return (
@@ -38,16 +43,18 @@ export default function AccordionComponent({
type="single"
className="w-full"
value={value}
- onValueChange={setValue}
+ onValueChange={!disabled ? setValue : () => {}}
>
{
handleClick();
}}
- className={
- sideBar ? "w-full bg-muted px-[0.75rem] py-[0.5rem]" : "ml-3"
- }
+ disabled={disabled}
+ className={cn(
+ sideBar ? "w-full bg-muted px-[0.75rem] py-[0.5rem]" : "ml-3",
+ disabled ? "cursor-not-allowed" : "cursor-pointer",
+ )}
>
{trigger}
diff --git a/src/frontend/src/components/addNewVariableButtonComponent/addNewVariableButton.tsx b/src/frontend/src/components/addNewVariableButtonComponent/addNewVariableButton.tsx
index 5da7d1461..61dada650 100644
--- a/src/frontend/src/components/addNewVariableButtonComponent/addNewVariableButton.tsx
+++ b/src/frontend/src/components/addNewVariableButtonComponent/addNewVariableButton.tsx
@@ -7,7 +7,6 @@ import { useTypesStore } from "../../stores/typesStore";
import { ResponseErrorDetailAPI } from "../../types/api";
import ForwardedIconComponent from "../genericIconComponent";
import InputComponent from "../inputComponent";
-import { Button } from "../ui/button";
import { Input } from "../ui/input";
import { Label } from "../ui/label";
import { Textarea } from "../ui/textarea";
@@ -24,19 +23,19 @@ export default function AddNewVariableButton({ children }): JSX.Element {
const setErrorData = useAlertStore((state) => state.setErrorData);
const componentFields = useTypesStore((state) => state.ComponentFields);
const unavaliableFields = new Set(
- Object.keys(useGlobalVariablesStore((state) => state.unavaliableFields)),
+ Object.keys(useGlobalVariablesStore((state) => state.unavaliableFields))
);
const availableFields = () => {
const fields = Array.from(componentFields).filter(
- (field) => !unavaliableFields.has(field),
+ (field) => !unavaliableFields.has(field)
);
return sortByName(fields);
};
const addGlobalVariable = useGlobalVariablesStore(
- (state) => state.addGlobalVariable,
+ (state) => state.addGlobalVariable
);
function handleSaveVariable() {
@@ -65,12 +64,17 @@ export default function AddNewVariableButton({ children }): JSX.Element {
let responseError = error as ResponseErrorDetailAPI;
setErrorData({
title: "Error creating variable",
- list: [responseError.response.data.detail ?? "Unknown error"],
+ list: [responseError?.response?.data?.detail ?? "Unknown error"],
});
});
}
return (
-
+
-
- Save Variable
-
+
);
}
diff --git a/src/frontend/src/components/cardComponent/components/dragCardComponent/index.tsx b/src/frontend/src/components/cardComponent/components/dragCardComponent/index.tsx
index 54dbf4846..28674f3bc 100644
--- a/src/frontend/src/components/cardComponent/components/dragCardComponent/index.tsx
+++ b/src/frontend/src/components/cardComponent/components/dragCardComponent/index.tsx
@@ -10,7 +10,7 @@ export default function DragCardComponent({ data }: { data: storeComponent }) {
draggable
//TODO check color schema
className={cn(
- "group relative flex flex-col justify-between overflow-hidden transition-all hover:bg-muted/50 hover:shadow-md hover:dark:bg-[#ffffff10]"
+ "group relative flex flex-col justify-between overflow-hidden transition-all hover:bg-muted/50 hover:shadow-md hover:dark:bg-[#ffffff10]",
)}
>
@@ -22,7 +22,7 @@ export default function DragCardComponent({ data }: { data: storeComponent }) {
"visible flex-shrink-0",
data.is_component
? "mx-0.5 h-6 w-6 text-component-icon"
- : "h-7 w-7 flex-shrink-0 text-flow-icon"
+ : "h-7 w-7 flex-shrink-0 text-flow-icon",
)}
name={data.is_component ? "ToyBrick" : "Group"}
/>
diff --git a/src/frontend/src/components/codeAreaComponent/index.tsx b/src/frontend/src/components/codeAreaComponent/index.tsx
index dd68e774a..cd8a0f77d 100644
--- a/src/frontend/src/components/codeAreaComponent/index.tsx
+++ b/src/frontend/src/components/codeAreaComponent/index.tsx
@@ -18,7 +18,7 @@ export default function CodeAreaComponent({
setOpen,
}: CodeAreaComponentType) {
const [myValue, setMyValue] = useState(
- typeof value == "string" ? value : JSON.stringify(value)
+ typeof value == "string" ? value : JSON.stringify(value),
);
useEffect(() => {
if (disabled && myValue !== "") {
diff --git a/src/frontend/src/components/dictComponent/index.tsx b/src/frontend/src/components/dictComponent/index.tsx
index 5161135ed..39850e6e3 100644
--- a/src/frontend/src/components/dictComponent/index.tsx
+++ b/src/frontend/src/components/dictComponent/index.tsx
@@ -29,7 +29,7 @@ export default function DictComponent({
1 && editNode ? "my-1" : "",
- "flex flex-col gap-3"
+ "flex flex-col gap-3",
)}
>
{
diff --git a/src/frontend/src/components/dropdownComponent/index.tsx b/src/frontend/src/components/dropdownComponent/index.tsx
index bff138681..8402d166e 100644
--- a/src/frontend/src/components/dropdownComponent/index.tsx
+++ b/src/frontend/src/components/dropdownComponent/index.tsx
@@ -59,7 +59,7 @@ export default function Dropdown({
? "dropdown-component-outline"
: "dropdown-component-false-outline",
"w-full justify-between font-normal",
- editNode ? "input-edit-node" : "py-2"
+ editNode ? "input-edit-node" : "py-2",
)}
>
@@ -107,7 +107,7 @@ export default function Dropdown({
name="Check"
className={cn(
"ml-auto h-4 w-4 text-primary",
- value === option ? "opacity-100" : "opacity-0"
+ value === option ? "opacity-100" : "opacity-0",
)}
/>
diff --git a/src/frontend/src/components/editFlowSettingsComponent/index.tsx b/src/frontend/src/components/editFlowSettingsComponent/index.tsx
index d8a6fc43e..26bc138e3 100644
--- a/src/frontend/src/components/editFlowSettingsComponent/index.tsx
+++ b/src/frontend/src/components/editFlowSettingsComponent/index.tsx
@@ -99,7 +99,7 @@ export const EditFlowSettings: React.FC = ({
{description === "" ? "No description" : description}
@@ -109,7 +109,7 @@ export const EditFlowSettings: React.FC = ({
{setEndpointName && (
-
Endpoint name:
+
Endpoint Name
{!isEndpointNameValid && (
Invalid endpoint name. Use only letters, numbers, hyphens, and
@@ -123,7 +123,7 @@ export const EditFlowSettings: React.FC = ({
type="text"
name="endpoint_name"
value={endpointName ?? ""}
- placeholder="An alternative name for the run endpoint"
+ placeholder="An alternative name to run the endpoint"
maxLength={maxLength}
id="endpoint_name"
onDoubleClickCapture={(event) => {
diff --git a/src/frontend/src/components/fetchErrorComponent/index.tsx b/src/frontend/src/components/fetchErrorComponent/index.tsx
index 956de6270..0e403c504 100644
--- a/src/frontend/src/components/fetchErrorComponent/index.tsx
+++ b/src/frontend/src/components/fetchErrorComponent/index.tsx
@@ -1,7 +1,6 @@
import BaseModal from "../../modals/baseModal";
import { fetchErrorComponentType } from "../../types/components";
import IconComponent from "../genericIconComponent";
-import { Button } from "../ui/button";
export default function FetchErrorComponent({
message,
@@ -12,7 +11,14 @@ export default function FetchErrorComponent({
}: fetchErrorComponentType) {
return (
<>
-
+ {
+ setRetry();
+ }}
+ >
-
-
-
{
- setRetry();
- }}
- >
- {isLoadingHealth ? (
-
-
-
- ) : (
- "Retry"
- )}
-
-
-
+
>
);
diff --git a/src/frontend/src/components/genericIconComponent/index.tsx b/src/frontend/src/components/genericIconComponent/index.tsx
index 398ae6e7f..9f7c687a1 100644
--- a/src/frontend/src/components/genericIconComponent/index.tsx
+++ b/src/frontend/src/components/genericIconComponent/index.tsx
@@ -18,7 +18,7 @@ export const ForwardedIconComponent = memo(
strokeWidth,
id = "",
}: IconComponentProps,
- ref
+ ref,
) => {
const [showFallback, setShowFallback] = useState(false);
@@ -65,8 +65,8 @@ export const ForwardedIconComponent = memo(
/>
);
- }
- )
+ },
+ ),
);
export default ForwardedIconComponent;
diff --git a/src/frontend/src/components/headerComponent/components/menuBar/index.tsx b/src/frontend/src/components/headerComponent/components/menuBar/index.tsx
index 7f7130090..35542ca2a 100644
--- a/src/frontend/src/components/headerComponent/components/menuBar/index.tsx
+++ b/src/frontend/src/components/headerComponent/components/menuBar/index.tsx
@@ -35,21 +35,11 @@ export const MenuBar = ({}: {}): JSX.Element => {
const navigate = useNavigate();
const isBuilding = useFlowStore((state) => state.isBuilding);
- function handleAddFlow(duplicate?: boolean) {
+ function handleAddFlow() {
try {
- if (duplicate) {
- if (!currentFlow) {
- throw new Error("No flow to duplicate");
- }
- addFlow(true, currentFlow).then((id) => {
- setSuccessData({ title: "Flow duplicated successfully" });
- navigate("/flow/" + id);
- });
- } else {
- addFlow(true).then((id) => {
- navigate("/flow/" + id);
- });
- }
+ addFlow(true).then((id) => {
+ navigate("/flow/" + id);
+ });
} catch (err) {
setErrorData(err as { title: string; list?: Array });
}
@@ -89,15 +79,6 @@ export const MenuBar = ({}: {}): JSX.Element => {
New
- {
- handleAddFlow(true);
- }}
- className="cursor-pointer"
- >
-
- Duplicate
-
{
@@ -132,7 +113,7 @@ export const MenuBar = ({}: {}): JSX.Element => {
title: UPLOAD_ERROR_ALERT,
list: [error],
});
- }
+ },
);
}}
>
@@ -214,7 +195,7 @@ export const MenuBar = ({}: {}): JSX.Element => {
name={isBuilding || saveLoading ? "Loader2" : "CheckCircle2"}
className={cn(
"h-4 w-4",
- isBuilding || saveLoading ? "animate-spin" : "animate-wiggle"
+ isBuilding || saveLoading ? "animate-spin" : "animate-wiggle",
)}
/>
{printByBuildStatus()}
diff --git a/src/frontend/src/components/headerComponent/index.tsx b/src/frontend/src/components/headerComponent/index.tsx
index 209445b17..0abd16d3d 100644
--- a/src/frontend/src/components/headerComponent/index.tsx
+++ b/src/frontend/src/components/headerComponent/index.tsx
@@ -56,7 +56,7 @@ export default function Header(): JSX.Element {
const lastFlowVisitedIndex = routeHistory
.reverse()
.findIndex(
- (path) => path.includes("/flow/") && path !== location.pathname
+ (path) => path.includes("/flow/") && path !== location.pathname,
);
const lastFlowVisited = routeHistory[lastFlowVisitedIndex];
diff --git a/src/frontend/src/components/horizontalScrollFadeComponent/index.tsx b/src/frontend/src/components/horizontalScrollFadeComponent/index.tsx
index 7e0c788a6..e0bf48917 100644
--- a/src/frontend/src/components/horizontalScrollFadeComponent/index.tsx
+++ b/src/frontend/src/components/horizontalScrollFadeComponent/index.tsx
@@ -32,11 +32,11 @@ export default function HorizontalScrollFadeComponent({
fadeContainerRef.current.classList.toggle(
"fade-left",
- isScrollable && !atStart
+ isScrollable && !atStart,
);
fadeContainerRef.current.classList.toggle(
"fade-right",
- isScrollable && !atEnd
+ isScrollable && !atEnd,
);
};
diff --git a/src/frontend/src/components/inputComponent/components/popover/index.tsx b/src/frontend/src/components/inputComponent/components/popover/index.tsx
index 52f79203f..78fc5bd3d 100644
--- a/src/frontend/src/components/inputComponent/components/popover/index.tsx
+++ b/src/frontend/src/components/inputComponent/components/popover/index.tsx
@@ -108,7 +108,7 @@ const CustomInputPopover = ({
/>
{myValue !== "" ? myValue : "No file"}
diff --git a/src/frontend/src/components/inputGlobalComponent/index.tsx b/src/frontend/src/components/inputGlobalComponent/index.tsx
index 318c57518..e9edd2e50 100644
--- a/src/frontend/src/components/inputGlobalComponent/index.tsx
+++ b/src/frontend/src/components/inputGlobalComponent/index.tsx
@@ -19,15 +19,15 @@ export default function InputGlobalComponent({
editNode = false,
}: InputGlobalComponentType): JSX.Element {
const globalVariablesEntries = useGlobalVariablesStore(
- (state) => state.globalVariablesEntries
+ (state) => state.globalVariablesEntries,
);
const getVariableId = useGlobalVariablesStore((state) => state.getVariableId);
const unavaliableFields = useGlobalVariablesStore(
- (state) => state.unavaliableFields
+ (state) => state.unavaliableFields,
);
const removeGlobalVariable = useGlobalVariablesStore(
- (state) => state.removeGlobalVariable
+ (state) => state.removeGlobalVariable,
);
const setErrorData = useAlertStore((state) => state.setErrorData);
@@ -130,7 +130,7 @@ export default function InputGlobalComponent({
diff --git a/src/frontend/src/components/inputListComponent/index.tsx b/src/frontend/src/components/inputListComponent/index.tsx
index f0aa8cca7..ce55ff1d4 100644
--- a/src/frontend/src/components/inputListComponent/index.tsx
+++ b/src/frontend/src/components/inputListComponent/index.tsx
@@ -55,10 +55,11 @@ export default function InputListComponent({
/>
{idx === value.length - 1 ? (
{
+ onClick={(e) => {
let newInputList = _.cloneDeep(value);
newInputList.push("");
onChange(newInputList);
+ e.preventDefault();
}}
data-testid={
`input-list-plus-btn${
@@ -79,10 +80,11 @@ export default function InputListComponent({
editNode ? "-edit" : ""
}_${componentName}-` + idx
}
- onClick={() => {
+ onClick={(e) => {
let newInputList = _.cloneDeep(value);
newInputList.splice(idx, 1);
onChange(newInputList);
+ e.preventDefault();
}}
disabled={disabled || playgroundDisabled}
>
diff --git a/src/frontend/src/components/shadTooltipComponent/index.tsx b/src/frontend/src/components/shadTooltipComponent/index.tsx
index 080935b6f..6bfad61ee 100644
--- a/src/frontend/src/components/shadTooltipComponent/index.tsx
+++ b/src/frontend/src/components/shadTooltipComponent/index.tsx
@@ -13,7 +13,6 @@ export default function ShadTooltip({
return (
{children}
-
void;
};
-const SideBarButtonsComponent = ({ items }: SideBarButtonsComponentProps) => {
+const SideBarButtonsComponent = ({
+ items,
+ pathname,
+}: SideBarButtonsComponentProps) => {
return (
- <>
+
{items.map((item) => (
{
data-testid={`sidebar-nav-${item.title}`}
className={cn(
buttonVariants({ variant: "ghost" }),
- "!w-[200px] cursor-pointer justify-start gap-2 border border-transparent hover:border-border hover:bg-transparent"
+ pathname === item.href
+ ? "border border-border bg-muted hover:bg-muted"
+ : "border border-transparent hover:border-border hover:bg-transparent",
+ "flex w-full shrink-0 justify-start gap-4",
)}
>
- {item.title}
+ {item.icon}
+
+ {item.title}
+
))}
- >
+
);
};
export default SideBarButtonsComponent;
diff --git a/src/frontend/src/components/sidebarComponent/components/sideBarFolderButtons/index.tsx b/src/frontend/src/components/sidebarComponent/components/sideBarFolderButtons/index.tsx
index ae40c4613..bf2353edd 100644
--- a/src/frontend/src/components/sidebarComponent/components/sideBarFolderButtons/index.tsx
+++ b/src/frontend/src/components/sidebarComponent/components/sideBarFolderButtons/index.tsx
@@ -93,24 +93,25 @@ const SideBarFoldersButtonsComponent = ({
return (
<>
-
-
-
- New Folder
+
+
Folders
+
+
-
- Upload
+
@@ -176,11 +177,11 @@ const SideBarFoldersButtonsComponent = ({
event.stopPropagation();
event.preventDefault();
}}
- className="flex w-full items-center gap-2"
+ className="flex w-full items-center gap-4"
>
{editFolderName?.edit ? (
@@ -259,14 +260,14 @@ const SideBarFoldersButtonsComponent = ({
}}
value={foldersNames[item.name]}
id={`input-folder-${item.name}`}
+ data-testid={`input-folder`}
/>
) : (
-
+
{item.name}
)}
-
{index > 0 && (
)}
- {/* {index > 0 && (
-
{
- e.stopPropagation();
- e.preventDefault();
- }}
- variant={"ghost"}
- >
-
-
- )} */}
{
@@ -305,7 +291,8 @@ const SideBarFoldersButtonsComponent = ({
e.stopPropagation();
e.preventDefault();
}}
- variant={"ghost"}
+ size="none"
+ variant="none"
>
-
-
- {!loadingFolders && folders?.length > 0 && isFolderPath && (
-
+ {items.length > 0 ? (
+
+ ) : (
+ !loadingFolders &&
+ folders?.length > 0 &&
+ isFolderPath && (
+
+ )
)}
diff --git a/src/frontend/src/components/tableAutoCellRender/index.tsx b/src/frontend/src/components/tableAutoCellRender/index.tsx
index b11cc14ad..21b598dd2 100644
--- a/src/frontend/src/components/tableAutoCellRender/index.tsx
+++ b/src/frontend/src/components/tableAutoCellRender/index.tsx
@@ -34,7 +34,7 @@ export default function TableAutoCellRender({
variant="outline"
size="sq"
className={cn(
- "min-w-min bg-success-background text-success-foreground hover:bg-success-background"
+ "min-w-min bg-success-background text-success-foreground hover:bg-success-background",
)}
>
{value}
diff --git a/src/frontend/src/components/tagsSelectorComponent/index.tsx b/src/frontend/src/components/tagsSelectorComponent/index.tsx
index 6f84ae7c3..4e3e181fc 100644
--- a/src/frontend/src/components/tagsSelectorComponent/index.tsx
+++ b/src/frontend/src/components/tagsSelectorComponent/index.tsx
@@ -48,7 +48,7 @@ export function TagsSelector({
className={cn(
selectedTags.some((category) => category === tag.name)
? "min-w-min bg-beta-foreground text-background hover:bg-beta-foreground"
- : ""
+ : "",
)}
>
{tag.name}
diff --git a/src/frontend/src/components/ui/accordion.tsx b/src/frontend/src/components/ui/accordion.tsx
index b0eaaf47a..39578095b 100644
--- a/src/frontend/src/components/ui/accordion.tsx
+++ b/src/frontend/src/components/ui/accordion.tsx
@@ -4,6 +4,7 @@ import * as AccordionPrimitive from "@radix-ui/react-accordion";
import { ChevronDownIcon } from "@radix-ui/react-icons";
import * as React from "react";
import { cn } from "../../utils/utils";
+import ShadTooltip from "../shadTooltipComponent";
const Accordion = AccordionPrimitive.Root;
@@ -22,17 +23,33 @@ AccordionItem.displayName = "AccordionItem";
const AccordionTrigger = React.forwardRef<
React.ElementRef,
React.ComponentPropsWithoutRef
->(({ className, children, ...props }, ref) => (
+>(({ className, children, disabled, ...props }, ref) => (
-
+
svg]:rotate-180",
- className
+ className,
)}
>
{children}
-
+
+
+
@@ -47,7 +64,7 @@ const AccordionContent = React.forwardRef<
ref={ref}
className={cn(
"data-[state=closed]:animate-accordion-up data-[state=open]:animate-accordion-down overflow-hidden text-sm",
- className
+ className,
)}
{...props}
>
diff --git a/src/frontend/src/components/ui/alert.tsx b/src/frontend/src/components/ui/alert.tsx
index a584cea26..05c5b7fa7 100644
--- a/src/frontend/src/components/ui/alert.tsx
+++ b/src/frontend/src/components/ui/alert.tsx
@@ -15,7 +15,7 @@ const alertVariants = cva(
defaultVariants: {
variant: "default",
},
- }
+ },
);
const Alert = React.forwardRef<
diff --git a/src/frontend/src/components/ui/button.tsx b/src/frontend/src/components/ui/button.tsx
index 431287f39..cbcadf2c3 100644
--- a/src/frontend/src/components/ui/button.tsx
+++ b/src/frontend/src/components/ui/button.tsx
@@ -2,9 +2,10 @@ import { Slot } from "@radix-ui/react-slot";
import { cva, type VariantProps } from "class-variance-authority";
import * as React from "react";
import { cn } from "../../utils/utils";
+import ForwardedIconComponent from "../genericIconComponent";
const buttonVariants = cva(
- "inline-flex items-center justify-center rounded-md text-sm font-medium transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:opacity-50 disabled:pointer-events-none ring-offset-background",
+ "inline-flex items-center justify-center gap-2 rounded-md text-sm font-medium transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:opacity-50 disabled:pointer-events-none ring-offset-background",
{
variants: {
variant: {
@@ -19,6 +20,7 @@ const buttonVariants = cva(
"border border-muted bg-muted text-secondary-foreground hover:bg-secondary-foreground/5",
ghost: "hover:bg-accent hover:text-accent-foreground",
link: "underline-offset-4 hover:underline text-primary",
+ none: "",
},
size: {
default: "h-10 py-2 px-4",
@@ -26,19 +28,21 @@ const buttonVariants = cva(
xs: "py-0.5 px-3 rounded-md",
lg: "h-11 px-8 rounded-md",
icon: "py-1 px-1 rounded-md",
+ none: "",
},
},
defaultVariants: {
variant: "default",
size: "default",
},
- }
+ },
);
export interface ButtonProps
extends React.ButtonHTMLAttributes,
VariantProps {
asChild?: boolean;
+ loading?: boolean;
}
function toTitleCase(text: string) {
@@ -49,21 +53,49 @@ function toTitleCase(text: string) {
}
const Button = React.forwardRef(
- ({ className, variant, size, asChild = false, children, ...props }, ref) => {
+ (
+ {
+ className,
+ variant,
+ size,
+ loading,
+ disabled,
+ asChild = false,
+ children,
+ ...props
+ },
+ ref,
+ ) => {
const Comp = asChild ? Slot : "button";
let newChildren = children;
if (typeof children === "string") {
newChildren = toTitleCase(children);
}
return (
-
+ <>
+
+ {loading ? (
+
+ {newChildren}
+
+
+
+
+ ) : (
+ newChildren
+ )}
+
+ >
);
- }
+ },
);
Button.displayName = "Button";
diff --git a/src/frontend/src/components/ui/card.tsx b/src/frontend/src/components/ui/card.tsx
index 459396f46..a558fb645 100644
--- a/src/frontend/src/components/ui/card.tsx
+++ b/src/frontend/src/components/ui/card.tsx
@@ -8,8 +8,8 @@ const Card = React.forwardRef<
@@ -36,7 +36,7 @@ const CardTitle = React.forwardRef<
ref={ref}
className={cn(
"text-base font-semibold leading-tight tracking-tight",
- className
+ className,
)}
{...props}
/>
diff --git a/src/frontend/src/components/ui/checkbox.tsx b/src/frontend/src/components/ui/checkbox.tsx
index 2c48fde84..1c3d1e4fe 100644
--- a/src/frontend/src/components/ui/checkbox.tsx
+++ b/src/frontend/src/components/ui/checkbox.tsx
@@ -13,7 +13,7 @@ const Checkbox = React.forwardRef<
ref={ref}
className={cn(
"peer h-4 w-4 shrink-0 rounded-sm border border-primary ring-offset-background focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 data-[state=checked]:bg-primary data-[state=checked]:text-primary-foreground",
- className
+ className,
)}
{...props}
>
@@ -37,7 +37,7 @@ const CheckBoxDiv = ({
className={cn(
className,
"peer h-4 w-4 shrink-0 rounded-sm border border-primary ring-offset-background focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50",
- checked ? "bg-primary text-primary-foreground" : ""
+ checked ? "bg-primary text-primary-foreground" : "",
)}
>
{checked && (
diff --git a/src/frontend/src/components/ui/custom-accordion.tsx b/src/frontend/src/components/ui/custom-accordion.tsx
index 9ce39b8a7..507649ca5 100644
--- a/src/frontend/src/components/ui/custom-accordion.tsx
+++ b/src/frontend/src/components/ui/custom-accordion.tsx
@@ -24,7 +24,7 @@ const AccordionTrigger = React.forwardRef<
svg]:rotate-180",
- className
+ className,
)}
>
{children}
@@ -43,7 +43,7 @@ const AccordionContent = React.forwardRef<
ref={ref}
className={cn(
"data-[state=closed]:animate-accordion-up data-[state=open]:animate-accordion-down overflow-hidden border-[1px] text-sm data-[state=open]:rounded-b-md data-[state=open]:border-t-0 data-[state=open]:bg-muted",
- className
+ className,
)}
{...props}
>
diff --git a/src/frontend/src/components/ui/form.tsx b/src/frontend/src/components/ui/form.tsx
index dcfd39449..69789b6f3 100644
--- a/src/frontend/src/components/ui/form.tsx
+++ b/src/frontend/src/components/ui/form.tsx
@@ -16,18 +16,18 @@ const Form = FormProvider;
type FormFieldContextValue<
TFieldValues extends FieldValues = FieldValues,
- TName extends FieldPath
= FieldPath
+ TName extends FieldPath = FieldPath,
> = {
name: TName;
};
const FormFieldContext = React.createContext(
- {} as FormFieldContextValue
+ {} as FormFieldContextValue,
);
const FormField = <
TFieldValues extends FieldValues = FieldValues,
- TName extends FieldPath = FieldPath
+ TName extends FieldPath = FieldPath,
>({
...props
}: ControllerProps) => {
@@ -66,7 +66,7 @@ type FormItemContextValue = {
};
const FormItemContext = React.createContext(
- {} as FormItemContextValue
+ {} as FormItemContextValue,
);
const FormItem = React.forwardRef<
diff --git a/src/frontend/src/components/ui/tooltip.tsx b/src/frontend/src/components/ui/tooltip.tsx
index 4120219ec..3ee8b9c7a 100644
--- a/src/frontend/src/components/ui/tooltip.tsx
+++ b/src/frontend/src/components/ui/tooltip.tsx
@@ -20,7 +20,7 @@ const TooltipContent = React.forwardRef<
sideOffset={sideOffset}
className={cn(
"z-45 overflow-y-auto rounded-md border bg-popover px-3 py-1.5 text-sm text-popover-foreground shadow-md animate-in fade-in-50 data-[side=bottom]:slide-in-from-top-1 data-[side=left]:slide-in-from-right-1 data-[side=right]:slide-in-from-left-1 data-[side=top]:slide-in-from-bottom-1",
- className
+ className,
)}
{...props}
/>
@@ -28,4 +28,26 @@ const TooltipContent = React.forwardRef<
));
TooltipContent.displayName = TooltipPrimitive.Content.displayName;
-export { Tooltip, TooltipContent, TooltipProvider, TooltipTrigger };
+const TooltipContentWithoutPortal = React.forwardRef<
+ React.ElementRef,
+ React.ComponentPropsWithoutRef
+>(({ className, sideOffset = 4, ...props }, ref) => (
+
+));
+TooltipContentWithoutPortal.displayName = TooltipPrimitive.Content.displayName;
+
+export {
+ Tooltip,
+ TooltipContent,
+ TooltipContentWithoutPortal,
+ TooltipProvider,
+ TooltipTrigger,
+};
diff --git a/src/frontend/src/constants/constants.ts b/src/frontend/src/constants/constants.ts
index 47563b9c3..b0a0b55c5 100644
--- a/src/frontend/src/constants/constants.ts
+++ b/src/frontend/src/constants/constants.ts
@@ -590,6 +590,7 @@ export const CONTROL_PATCH_USER_STATE = {
password: "",
cnfPassword: "",
gradient: "",
+ apikey: "",
};
export const CONTROL_LOGIN_STATE = {
diff --git a/src/frontend/src/controllers/API/api.tsx b/src/frontend/src/controllers/API/api.tsx
index d630c474f..560519d94 100644
--- a/src/frontend/src/controllers/API/api.tsx
+++ b/src/frontend/src/controllers/API/api.tsx
@@ -2,11 +2,11 @@ import axios, { AxiosError, AxiosInstance } from "axios";
import { useContext, useEffect } from "react";
import { Cookies } from "react-cookie";
import { renewAccessToken } from ".";
-import { AUTHORIZED_DUPLICATE_REQUESTS } from "../../constants/constants";
import { BuildStatus } from "../../constants/enums";
import { AuthContext } from "../../contexts/authContext";
import useAlertStore from "../../stores/alertStore";
import useFlowStore from "../../stores/flowStore";
+import { checkDuplicateRequestAndStoreRequest } from "./helpers/check-duplicate-requests";
// Create a new Axios instance
const api: AxiosInstance = axios.create({
@@ -81,28 +81,12 @@ function ApiInterceptor() {
// Request interceptor to add access token to every request
const requestInterceptor = api.interceptors.request.use(
(config) => {
- const lastUrl = localStorage.getItem("lastUrlCalled");
- const lastMethodCalled = localStorage.getItem("lastMethodCalled");
+ const checkRequest = checkDuplicateRequestAndStoreRequest(config);
- const isContained = AUTHORIZED_DUPLICATE_REQUESTS.some((request) =>
- config?.url!.includes(request),
- );
-
- if (
- config?.url === lastUrl &&
- !isContained &&
- lastMethodCalled === config.method
- ) {
- return Promise.reject("Duplicate request");
+ if (!checkRequest) {
+ return Promise.reject("Duplicate request.");
}
- localStorage.setItem("lastUrlCalled", config.url ?? "");
- localStorage.setItem("lastMethodCalled", config.method ?? "");
- localStorage.setItem(
- "lastRequestData",
- JSON.stringify(config.data) ?? "",
- );
-
const accessToken = cookies.get("access_token_lf");
if (accessToken && !isAuthorizedURL(config?.url)) {
config.headers["Authorization"] = `Bearer ${accessToken}`;
diff --git a/src/frontend/src/controllers/API/helpers/check-duplicate-requests.ts b/src/frontend/src/controllers/API/helpers/check-duplicate-requests.ts
new file mode 100644
index 000000000..79a47c7a7
--- /dev/null
+++ b/src/frontend/src/controllers/API/helpers/check-duplicate-requests.ts
@@ -0,0 +1,30 @@
+import { AUTHORIZED_DUPLICATE_REQUESTS } from "../../../constants/constants";
+
+export function checkDuplicateRequestAndStoreRequest(config) {
+ const lastUrl = localStorage.getItem("lastUrlCalled");
+ const lastMethodCalled = localStorage.getItem("lastMethodCalled");
+ const lastRequestTime = localStorage.getItem("lastRequestTime");
+
+ const currentTime = Date.now();
+
+ const isContained = AUTHORIZED_DUPLICATE_REQUESTS.some((request) =>
+ config?.url!.includes(request),
+ );
+
+ if (
+ config?.url === lastUrl &&
+ !isContained &&
+ lastMethodCalled === config.method &&
+ lastMethodCalled === "get" && // Only deduplicate GET requests
+ lastRequestTime &&
+ currentTime - parseInt(lastRequestTime, 10) < 800
+ ) {
+ return false;
+ }
+
+ localStorage.setItem("lastUrlCalled", config.url ?? "");
+ localStorage.setItem("lastMethodCalled", config.method ?? "");
+ localStorage.setItem("lastRequestTime", currentTime.toString());
+
+ return true;
+}
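The new helper above rejects a repeated GET to the same URL within an 800 ms window. A minimal sketch of that logic, with an in-memory `Map` standing in for `localStorage` so it runs outside the browser (the allow-list entries here are hypothetical placeholders, not the real `AUTHORIZED_DUPLICATE_REQUESTS` values):

```typescript
// In-memory stand-in for localStorage (assumption: same get/set string semantics).
const store = new Map<string, string>();

// Hypothetical allow-list; the real values live in constants.ts.
const AUTHORIZED_DUPLICATE_REQUESTS = ["/health", "/build"];

type RequestConfig = { url?: string; method?: string };

function checkDuplicateRequestAndStoreRequest(
  config: RequestConfig,
  now: number = Date.now(),
): boolean {
  const lastUrl = store.get("lastUrlCalled");
  const lastMethod = store.get("lastMethodCalled");
  const lastTime = store.get("lastRequestTime");

  // URLs on the allow-list may always repeat.
  const isAllowed = AUTHORIZED_DUPLICATE_REQUESTS.some((r) =>
    config.url?.includes(r),
  );

  // Reject only a repeated GET to the same URL within 800 ms.
  if (
    config.url === lastUrl &&
    !isAllowed &&
    lastMethod === config.method &&
    lastMethod === "get" &&
    lastTime !== undefined &&
    now - parseInt(lastTime, 10) < 800
  ) {
    return false;
  }

  // Record this request as the most recent one.
  store.set("lastUrlCalled", config.url ?? "");
  store.set("lastMethodCalled", config.method ?? "");
  store.set("lastRequestTime", now.toString());
  return true;
}
```

Note that the reject path does not update the stored timestamp, so the 800 ms window is measured from the last *accepted* request.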
diff --git a/src/frontend/src/controllers/API/index.ts b/src/frontend/src/controllers/API/index.ts
index 2d0f46da3..3fed3f4da 100644
--- a/src/frontend/src/controllers/API/index.ts
+++ b/src/frontend/src/controllers/API/index.ts
@@ -61,7 +61,7 @@ export async function sendAll(data: sendAllProps) {
}
export async function postValidateCode(
- code: string
+ code: string,
): Promise> {
return await api.post(`${BASE_URL_API}validate/code`, { code });
}
@@ -76,7 +76,7 @@ export async function postValidateCode(
export async function postValidatePrompt(
name: string,
template: string,
- frontend_node: APIClassType
+ frontend_node: APIClassType,
): Promise> {
return api.post(`${BASE_URL_API}validate/prompt`, {
name,
@@ -149,7 +149,7 @@ export async function saveFlowToDatabase(newFlow: {
* @throws Will throw an error if the update fails.
*/
export async function updateFlowInDatabase(
- updatedFlow: FlowType
+ updatedFlow: FlowType,
): Promise {
try {
const response = await api.patch(`${BASE_URL_API}flows/${updatedFlow.id}`, {
@@ -327,7 +327,7 @@ export async function getHealth() {
*
*/
export async function getBuildStatus(
- flowId: string
+ flowId: string,
): Promise> {
return await api.get(`${BASE_URL_API}build/${flowId}/status`);
}
@@ -340,7 +340,7 @@ export async function getBuildStatus(
*
*/
export async function postBuildInit(
- flow: FlowType
+ flow: FlowType,
): Promise> {
return await api.post(`${BASE_URL_API}build/init/${flow.id}`, flow);
}
@@ -356,7 +356,7 @@ export async function postBuildInit(
*/
export async function uploadFile(
file: File,
- id: string
+ id: string,
): Promise> {
const formData = new FormData();
formData.append("file", file);
@@ -365,7 +365,7 @@ export async function uploadFile(
export async function postCustomComponent(
code: string,
- apiClass: APIClassType
+ apiClass: APIClassType,
): Promise> {
// let template = apiClass.template;
return await api.post(`${BASE_URL_API}custom_component`, {
@@ -378,7 +378,7 @@ export async function postCustomComponentUpdate(
code: string,
template: APITemplateType,
field: string,
- field_value: any
+ field_value: any,
): Promise> {
return await api.post(`${BASE_URL_API}custom_component/update`, {
code,
@@ -400,7 +400,7 @@ export async function onLogin(user: LoginType) {
headers: {
"Content-Type": "application/x-www-form-urlencoded",
},
- }
+ },
);
if (response.status === 200) {
@@ -462,11 +462,11 @@ export async function addUser(user: UserInputType): Promise> {
export async function getUsersPage(
skip: number,
- limit: number
+ limit: number,
): Promise> {
try {
const res = await api.get(
- `${BASE_URL_API}users/?skip=${skip}&limit=${limit}`
+ `${BASE_URL_API}users/?skip=${skip}&limit=${limit}`,
);
if (res.status === 200) {
return res.data;
@@ -503,7 +503,7 @@ export async function resetPassword(user_id: string, user: resetPasswordType) {
try {
const res = await api.patch(
`${BASE_URL_API}users/${user_id}/reset-password`,
- user
+ user,
);
if (res.status === 200) {
return res.data;
@@ -577,7 +577,7 @@ export async function saveFlowStore(
last_tested_version?: string;
},
tags: string[],
- publicFlow = false
+ publicFlow = false,
): Promise {
try {
const response = await api.post(`${BASE_URL_API}store/components/`, {
@@ -706,7 +706,7 @@ export async function postStoreComponents(component: Component) {
export async function getComponent(component_id: string) {
try {
const res = await api.get(
- `${BASE_URL_API}store/components/${component_id}`
+ `${BASE_URL_API}store/components/${component_id}`,
);
if (res.status === 200) {
return res.data;
@@ -721,7 +721,7 @@ export async function searchComponent(
page?: number | null,
limit?: number | null,
status?: string | null,
- tags?: string[]
+ tags?: string[],
): Promise {
try {
let url = `${BASE_URL_API}store/components/`;
@@ -833,7 +833,7 @@ export async function updateFlowStore(
},
tags: string[],
publicFlow = false,
- id: string
+ id: string,
): Promise {
try {
const response = await api.patch(`${BASE_URL_API}store/components/${id}`, {
@@ -917,7 +917,7 @@ export async function deleteGlobalVariable(id: string) {
export async function updateGlobalVariable(
name: string,
value: string,
- id: string
+ id: string,
) {
try {
const response = api.patch(`${BASE_URL_API}variables/${id}`, {
@@ -936,7 +936,7 @@ export async function getVerticesOrder(
startNodeId?: string | null,
stopNodeId?: string | null,
nodes?: Node[],
- Edges?: Edge[]
+ Edges?: Edge[],
): Promise> {
// nodeId is optional and is a query parameter
// if nodeId is not provided, the API will return all vertices
@@ -956,19 +956,19 @@ export async function getVerticesOrder(
return await api.post(
`${BASE_URL_API}build/${flowId}/vertices`,
data,
- config
+ config,
);
}
export async function postBuildVertex(
flowId: string,
vertexId: string,
- input_value: string
+ input_value: string,
): Promise> {
// input_value is optional and is a query parameter
return await api.post(
`${BASE_URL_API}build/${flowId}/vertices/${vertexId}`,
- input_value ? { inputs: { input_value: input_value } } : undefined
+ input_value ? { inputs: { input_value: input_value } } : undefined,
);
}
@@ -992,7 +992,7 @@ export async function getFlowPool({
}
export async function deleteFlowPool(
- flowId: string
+ flowId: string,
): Promise> {
const config = {};
config["params"] = { flow_id: flowId };
@@ -1000,7 +1000,7 @@ export async function deleteFlowPool(
}
export async function multipleDeleteFlowsComponents(
- flowIds: string[]
+ flowIds: string[],
): Promise> {
return await api.post(`${BASE_URL_API}flows/multiple_delete/`, {
flow_ids: flowIds,
@@ -1010,7 +1010,7 @@ export async function multipleDeleteFlowsComponents(
export async function getTransactionTable(
id: string,
mode: "intersection" | "union",
- params = {}
+ params = {},
): Promise<{ rows: Array; columns: Array }> {
const config = {};
config["params"] = { flow_id: id };
@@ -1025,7 +1025,7 @@ export async function getTransactionTable(
export async function getMessagesTable(
id: string,
mode: "intersection" | "union",
- params = {}
+ params = {},
): Promise<{ rows: Array; columns: Array }> {
const config = {};
config["params"] = { flow_id: id };
diff --git a/src/frontend/src/customNodes/genericNode/components/parameterComponent/index.tsx b/src/frontend/src/customNodes/genericNode/components/parameterComponent/index.tsx
index 6d2c58643..cb49554c6 100644
--- a/src/frontend/src/customNodes/genericNode/components/parameterComponent/index.tsx
+++ b/src/frontend/src/customNodes/genericNode/components/parameterComponent/index.tsx
@@ -89,7 +89,7 @@ export default function ParameterComponent({
debouncedHandleUpdateValues,
setNode,
renderTooltips,
- setIsLoading
+ setIsLoading,
);
const { handleNodeClass: handleNodeClassHook } = useHandleNodeClass(
@@ -98,7 +98,7 @@ export default function ParameterComponent({
takeSnapshot,
setNode,
updateNodeInternals,
- renderTooltips
+ renderTooltips,
);
const { handleRefreshButtonPress: handleRefreshButtonPressHook } =
@@ -107,7 +107,7 @@ export default function ParameterComponent({
let disabled =
edges.some(
(edge) =>
- edge.targetHandle === scapedJSONStringfy(proxy ? { ...id, proxy } : id)
+ edge.targetHandle === scapedJSONStringfy(proxy ? { ...id, proxy } : id),
) ?? false;
const handleRefreshButtonPress = async (name, data) => {
@@ -120,12 +120,12 @@ export default function ParameterComponent({
handleUpdateValues,
setNode,
renderTooltips,
- setIsLoading
+ setIsLoading,
);
const handleOnNewValue = async (
newValue: string | string[] | boolean | Object[],
- skipSnapshot: boolean | undefined = false
+ skipSnapshot: boolean | undefined = false,
): Promise => {
handleOnNewValueHook(newValue, skipSnapshot);
};
@@ -207,7 +207,7 @@ export default function ParameterComponent({
className={classNames(
left ? "my-12 -ml-0.5 " : " my-12 -mr-0.5 ",
"h-3 w-3 rounded-full border-2 bg-background",
- !showNode ? "mt-0" : ""
+ !showNode ? "mt-0" : "",
)}
style={{
borderColor: color ?? nodeColors.unknown,
@@ -296,7 +296,7 @@ export default function ParameterComponent({
}
className={classNames(
left ? "-ml-0.5" : "-mr-0.5",
- "h-3 w-3 rounded-full border-2 bg-background"
+ "h-3 w-3 rounded-full border-2 bg-background",
)}
style={{ borderColor: color ?? nodeColors.unknown }}
onClick={() => setFilterEdge(groupedEdge.current)}
diff --git a/src/frontend/src/customNodes/genericNode/components/tooltipRenderComponent/index.tsx b/src/frontend/src/customNodes/genericNode/components/tooltipRenderComponent/index.tsx
index c76bc7293..ed2760161 100644
--- a/src/frontend/src/customNodes/genericNode/components/tooltipRenderComponent/index.tsx
+++ b/src/frontend/src/customNodes/genericNode/components/tooltipRenderComponent/index.tsx
@@ -24,7 +24,7 @@ const TooltipRenderComponent = ({ item, index, left }) => {
0 ? "mt-2 flex items-center" : "mt-3 flex items-center"
+ index > 0 ? "mt-2 flex items-center" : "mt-3 flex items-center",
)}
>
{iconNodeRender()}
{showNode && (
@@ -452,7 +453,7 @@ export default function GenericNode({
{
+ onClick={(event) => {
if (nameEditable) {
setInputName(true);
}
@@ -466,21 +467,6 @@ export default function GenericNode({
{data.node?.display_name}
- {nameEditable && (
-
{
- setInputName(true);
- takeSnapshot();
- event.stopPropagation();
- event.preventDefault();
- }}
- >
-
-
- )}
)}
@@ -718,14 +704,14 @@ export default function GenericNode({
) : (
{
+ onClick={(e) => {
setInputDescription(true);
takeSnapshot();
}}
diff --git a/src/frontend/src/customNodes/hooks/use-fetch-data-on-mount.tsx b/src/frontend/src/customNodes/hooks/use-fetch-data-on-mount.tsx
index f87b0b48a..7426164f7 100644
--- a/src/frontend/src/customNodes/hooks/use-fetch-data-on-mount.tsx
+++ b/src/frontend/src/customNodes/hooks/use-fetch-data-on-mount.tsx
@@ -40,7 +40,7 @@ const useFetchDataOnMount = (
setErrorData({
title: "Error while updating the Component",
- list: [responseError.response.data.detail ?? "Unknown error"],
+ list: [responseError?.response?.data?.detail ?? "Unknown error"],
});
}
setIsLoading(false);
diff --git a/src/frontend/src/customNodes/hooks/use-handle-new-value.tsx b/src/frontend/src/customNodes/hooks/use-handle-new-value.tsx
index dd394fca7..dc416646b 100644
--- a/src/frontend/src/customNodes/hooks/use-handle-new-value.tsx
+++ b/src/frontend/src/customNodes/hooks/use-handle-new-value.tsx
@@ -44,7 +44,9 @@ const useHandleOnNewValue = (
let responseError = error as ResponseErrorTypeAPI;
setErrorData({
title: "Error while updating the Component",
- list: [responseError.response.data.detail.error ?? "Unknown error"],
+ list: [
+ responseError?.response?.data?.detail?.error ?? "Unknown error",
+ ],
});
}
setIsLoading(false);
diff --git a/src/frontend/src/customNodes/hooks/use-handle-node-class.tsx b/src/frontend/src/customNodes/hooks/use-handle-node-class.tsx
index bdcf1f8cc..412658d77 100644
--- a/src/frontend/src/customNodes/hooks/use-handle-node-class.tsx
+++ b/src/frontend/src/customNodes/hooks/use-handle-node-class.tsx
@@ -6,7 +6,7 @@ const useHandleNodeClass = (
takeSnapshot,
setNode,
updateNodeInternals,
- renderTooltips
+ renderTooltips,
) => {
const handleNodeClass = (newNodeClass, code) => {
if (!data.node) return;
diff --git a/src/frontend/src/customNodes/hooks/use-handle-refresh-buttons.tsx b/src/frontend/src/customNodes/hooks/use-handle-refresh-buttons.tsx
index 19f2a3c29..4696aa994 100644
--- a/src/frontend/src/customNodes/hooks/use-handle-refresh-buttons.tsx
+++ b/src/frontend/src/customNodes/hooks/use-handle-refresh-buttons.tsx
@@ -26,7 +26,7 @@ const useHandleRefreshButtonPress = (setIsLoading, setNode, renderTooltips) => {
setErrorData({
title: "Error while updating the Component",
- list: [responseError.response.data.detail ?? "Unknown error"],
+ list: [responseError?.response?.data?.detail ?? "Unknown error"],
});
}
setIsLoading(false);
diff --git a/src/frontend/src/modals/IOModal/index.tsx b/src/frontend/src/modals/IOModal/index.tsx
index 82350b57e..e956c5262 100644
--- a/src/frontend/src/modals/IOModal/index.tsx
+++ b/src/frontend/src/modals/IOModal/index.tsx
@@ -3,7 +3,6 @@ import AccordionComponent from "../../components/accordionComponent";
import IconComponent from "../../components/genericIconComponent";
import ShadTooltip from "../../components/shadTooltipComponent";
import { Badge } from "../../components/ui/badge";
-import { Button } from "../../components/ui/button";
import {
Tabs,
TabsContent,
@@ -92,6 +91,7 @@ export default function IOModal({
await buildFlow({
input_value: chatValue,
startNodeId: chatInput?.id,
+ silent: true,
}).catch((err) => {
console.error(err);
setLockChat(false);
@@ -121,6 +121,7 @@ export default function IOModal({
open={open}
setOpen={setOpen}
disable={disable}
+ onSubmit={() => sendMessage(1)}
>
{children}
{/* TODO ADAPT TO ALL TYPES OF INPUTS AND OUTPUTS */}
@@ -253,6 +254,10 @@ export default function IOModal({
key={index}
>
{!haveChat ? (
-
-
- sendMessage(1)}
- >
+
- Run Flow
-
-
-
+ ),
+ }}
+ />
) : (
<>>
)}
diff --git a/src/frontend/src/modals/apiModal/utils/get-curl-code.tsx b/src/frontend/src/modals/apiModal/utils/get-curl-code.tsx
index 40257d73d..c17118d1a 100644
--- a/src/frontend/src/modals/apiModal/utils/get-curl-code.tsx
+++ b/src/frontend/src/modals/apiModal/utils/get-curl-code.tsx
@@ -8,14 +8,14 @@ export function getCurlRunCode(
flowId: string,
isAuth: boolean,
tweaksBuildedObject,
- endpointName?: string
+ endpointName?: string,
): string {
const tweaksObject = tweaksBuildedObject[0];
// show the endpoint name in the curl command if it exists
return `curl -X POST \\
"${window.location.protocol}//${window.location.host}/api/v1/run/${
- endpointName || flowId
- }?stream=false" \\
+ endpointName || flowId
+ }?stream=false" \\
-H 'Content-Type: application/json'\\${
!isAuth ? `\n -H 'x-api-key: '\\` : ""
}
diff --git a/src/frontend/src/modals/apiModal/utils/get-python-api-code.tsx b/src/frontend/src/modals/apiModal/utils/get-python-api-code.tsx
index a15c3ea94..a25e993fc 100644
--- a/src/frontend/src/modals/apiModal/utils/get-python-api-code.tsx
+++ b/src/frontend/src/modals/apiModal/utils/get-python-api-code.tsx
@@ -8,7 +8,7 @@
export default function getPythonApiCode(
flowId: string,
isAuth: boolean,
- tweaksBuildedObject
+ tweaksBuildedObject,
): string {
const tweaksObject = tweaksBuildedObject[0];
return `import requests
diff --git a/src/frontend/src/modals/apiModal/utils/get-python-code.tsx b/src/frontend/src/modals/apiModal/utils/get-python-code.tsx
index 9fd8048f2..ff571b492 100644
--- a/src/frontend/src/modals/apiModal/utils/get-python-code.tsx
+++ b/src/frontend/src/modals/apiModal/utils/get-python-code.tsx
@@ -6,7 +6,7 @@
*/
export default function getPythonCode(
flowName: string,
- tweaksBuildedObject
+ tweaksBuildedObject,
): string {
const tweaksObject = tweaksBuildedObject[0];
diff --git a/src/frontend/src/modals/apiModal/utils/tabs-array.tsx b/src/frontend/src/modals/apiModal/utils/tabs-array.tsx
index adc15f932..deed73332 100644
--- a/src/frontend/src/modals/apiModal/utils/tabs-array.tsx
+++ b/src/frontend/src/modals/apiModal/utils/tabs-array.tsx
@@ -1,7 +1,7 @@
export function createTabsArray(
codes,
includeWebhookCurl = false,
- includeTweaks = false
+ includeTweaks = false,
) {
const tabs = [
{
diff --git a/src/frontend/src/modals/apiModal/views/index.tsx b/src/frontend/src/modals/apiModal/views/index.tsx
index 1d93c979a..62ee7a4e5 100644
--- a/src/frontend/src/modals/apiModal/views/index.tsx
+++ b/src/frontend/src/modals/apiModal/views/index.tsx
@@ -35,7 +35,7 @@ const ApiModal = forwardRef(
flow: FlowType;
children: ReactNode;
},
- ref
+ ref,
) => {
const tweak = useTweaksStore((state) => state.tweak);
const addTweaks = useTweaksStore((state) => state.setTweak);
@@ -51,12 +51,12 @@ const ApiModal = forwardRef(
flow?.id,
autoLogin,
tweak,
- flow?.endpoint_name
+ flow?.endpoint_name,
);
const curl_webhook_code = getCurlWebhookCode(
flow?.id,
autoLogin,
- flow?.endpoint_name
+ flow?.endpoint_name,
);
const pythonCode = getPythonCode(flow?.name, tweak);
const widgetCode = getWidgetCode(flow?.id, flow?.name, autoLogin);
@@ -72,7 +72,7 @@ const ApiModal = forwardRef(
pythonCode,
];
const [tabs, setTabs] = useState(
- createTabsArray(codesArray, includeWebhook)
+ createTabsArray(codesArray, includeWebhook),
);
const canShowTweaks =
@@ -121,7 +121,7 @@ const ApiModal = forwardRef(
buildTweakObject(
nodeId,
element.data.node.template[templateField].value,
- element.data.node.template[templateField]
+ element.data.node.template[templateField],
);
}
});
@@ -138,7 +138,7 @@ const ApiModal = forwardRef(
async function buildTweakObject(
tw: string,
changes: string | string[] | boolean | number | Object[] | Object,
- template: TemplateVariableType
+ template: TemplateVariableType,
) {
changes = getChangesType(changes, template);
@@ -180,7 +180,7 @@ const ApiModal = forwardRef(
flow?.id,
autoLogin,
cloneTweak,
- flow?.endpoint_name
+ flow?.endpoint_name,
);
const pythonCode = getPythonCode(flow?.name, cloneTweak);
const widgetCode = getWidgetCode(flow?.id, flow?.name, autoLogin);
@@ -224,7 +224,7 @@ const ApiModal = forwardRef(
);
- }
+ },
);
export default ApiModal;
diff --git a/src/frontend/src/modals/baseModal/helpers/switch-case-size.ts b/src/frontend/src/modals/baseModal/helpers/switch-case-size.ts
new file mode 100644
index 000000000..35004c2ec
--- /dev/null
+++ b/src/frontend/src/modals/baseModal/helpers/switch-case-size.ts
@@ -0,0 +1,67 @@
+export const switchCaseModalSize = (size: string) => {
+ let minWidth: string;
+ let height: string;
+ switch (size) {
+ case "x-small":
+ minWidth = "min-w-[20vw]";
+ height = "h-full";
+ break;
+ case "smaller":
+ minWidth = "min-w-[40vw]";
+ height = "h-[11rem]";
+ break;
+ case "smaller-h-full":
+ minWidth = "min-w-[40vw]";
+ height = "h-full";
+ break;
+ case "small":
+ minWidth = "min-w-[40vw]";
+ height = "h-[40vh]";
+ break;
+ case "small-h-full":
+ minWidth = "min-w-[40vw]";
+ height = "h-full";
+ break;
+ case "medium":
+ minWidth = "min-w-[60vw]";
+ height = "h-[60vh]";
+ break;
+ case "medium-h-full":
+ minWidth = "min-w-[60vw]";
+ height = "h-full";
+
+ break;
+ case "large":
+ minWidth = "min-w-[85vw]";
+ height = "h-[80vh]";
+ break;
+ case "three-cards":
+ minWidth = "min-w-[1066px]";
+ height = "h-fit";
+ break;
+ case "large-thin":
+ minWidth = "min-w-[65vw]";
+ height = "h-[80vh]";
+ break;
+
+ case "md-thin":
+ minWidth = "min-w-[85vw]";
+ height = "h-[70vh]";
+ break;
+
+ case "sm-thin":
+ minWidth = "min-w-[65vw]";
+ height = "h-[70vh]";
+ break;
+
+ case "large-h-full":
+ minWidth = "min-w-[80vw]";
+ height = "h-full";
+ break;
+ default:
+ minWidth = "min-w-[80vw]";
+ height = "h-[80vh]";
+ break;
+ }
+ return { minWidth, height };
+};
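The extracted `switchCaseModalSize` helper maps a size name to a pair of Tailwind classes. The same mapping could be expressed as a lookup table rather than a switch; a sketch of that alternative, covering a few of the sizes above (not a drop-in replacement, just a design comparison):

```typescript
type ModalDims = { minWidth: string; height: string };

// Subset of the size table from switch-case-size.ts, as a plain record.
const MODAL_SIZES: Record<string, ModalDims> = {
  "x-small": { minWidth: "min-w-[20vw]", height: "h-full" },
  smaller: { minWidth: "min-w-[40vw]", height: "h-[11rem]" },
  medium: { minWidth: "min-w-[60vw]", height: "h-[60vh]" },
  large: { minWidth: "min-w-[85vw]", height: "h-[80vh]" },
};

// Default branch mirrors the switch's default case.
const DEFAULT_DIMS: ModalDims = { minWidth: "min-w-[80vw]", height: "h-[80vh]" };

function switchCaseModalSize(size: string): ModalDims {
  return MODAL_SIZES[size] ?? DEFAULT_DIMS;
}
```

A record keeps each size on one line and makes the default explicit, at the cost of losing exhaustiveness hints a `switch` on a string-literal union would give.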
diff --git a/src/frontend/src/modals/baseModal/index.tsx b/src/frontend/src/modals/baseModal/index.tsx
index 19fc80711..a53aab173 100644
--- a/src/frontend/src/modals/baseModal/index.tsx
+++ b/src/frontend/src/modals/baseModal/index.tsx
@@ -15,8 +15,11 @@ import {
DialogContent as ModalContent,
} from "../../components/ui/dialog-with-no-close";
+import { DialogClose } from "@radix-ui/react-dialog";
+import { Button } from "../../components/ui/button";
import { modalHeaderType } from "../../types/components";
import { cn } from "../../utils/utils";
+import { switchCaseModalSize } from "./helpers/switch-case-size";
type ContentProps = { children: ReactNode };
type HeaderProps = { children: ReactNode; description: string };
@@ -61,8 +64,38 @@ const Header: React.FC<{ children: ReactNode; description: string | null }> = ({
);
};
-const Footer: React.FC<{ children: ReactNode }> = ({ children }) => {
- return <>{children}>;
+const Footer: React.FC<{
+ children?: ReactNode;
+ submit?: {
+ label: string;
+ icon?: ReactNode;
+ loading?: boolean;
+ disabled?: boolean;
+ dataTestId?: string;
+ };
+}> = ({ children, submit }) => {
+ return submit ? (
+
+ {children ??
}
+
+
+
+ Cancel
+
+
+
+ {submit.icon && submit.icon}
+ {submit.label}
+
+
+
+ ) : (
+ <>{children && children}>
+ );
};
interface BaseModalProps {
children: [
@@ -91,6 +124,7 @@ interface BaseModalProps {
disable?: boolean;
onChangeOpenModal?: (open?: boolean) => void;
type?: "modal" | "dialog";
+ onSubmit?: () => void;
}
function BaseModal({
open,
@@ -99,6 +133,7 @@ function BaseModal({
size = "large",
onChangeOpenModal,
type = "dialog",
+ onSubmit,
}: BaseModalProps) {
const headerChild = React.Children.toArray(children).find(
(child) => (child as React.ReactElement).type === Header,
@@ -113,71 +148,7 @@ function BaseModal({
(child) => (child as React.ReactElement).type === Footer,
);
- let minWidth: string;
- let height: string;
-
- switch (size) {
- case "x-small":
- minWidth = "min-w-[20vw]";
- height = "h-full";
- break;
- case "smaller":
- minWidth = "min-w-[40vw]";
- height = "h-[11rem]";
- break;
- case "smaller-h-full":
- minWidth = "min-w-[40vw]";
- height = "h-full";
- break;
- case "small":
- minWidth = "min-w-[40vw]";
- height = "h-[40vh]";
- break;
- case "small-h-full":
- minWidth = "min-w-[40vw]";
- height = "h-full";
- break;
- case "medium":
- minWidth = "min-w-[60vw]";
- height = "h-[60vh]";
- break;
- case "medium-h-full":
- minWidth = "min-w-[60vw]";
- height = "h-full";
-
- break;
- case "large":
- minWidth = "min-w-[85vw]";
- height = "h-[80vh]";
- break;
- case "three-cards":
- minWidth = "min-w-[1066px]";
- height = "h-fit";
- break;
- case "large-thin":
- minWidth = "min-w-[65vw]";
- height = "h-[80vh]";
- break;
-
- case "md-thin":
- minWidth = "min-w-[85vw]";
- height = "h-[70vh]";
- break;
-
- case "sm-thin":
- minWidth = "min-w-[65vw]";
- height = "h-[70vh]";
- break;
-
- case "large-h-full":
- minWidth = "min-w-[80vw]";
- height = "h-full";
- break;
- default:
- minWidth = "min-w-[80vw]";
- height = "h-[80vh]";
- break;
- }
+ let { minWidth, height } = switchCaseModalSize(size);
useEffect(() => {
if (onChangeOpenModal) {
@@ -212,13 +183,34 @@ function BaseModal({
{headerChild}
-
- {ContentChild}
-
- {ContentFooter && (
- {ContentFooter}
+ {onSubmit ? (
+
+ ) : (
+ <>
+
+ {ContentChild}
+
+ {ContentFooter && (
+ {ContentFooter}
+ )}
+ </>
)}
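The size switch deleted above is replaced by a single call to the new `switch-case-size` helper. Its body is not shown in this diff, but it can be reconstructed directly from the deleted branches; the following is a sketch derived from those cases, not the actual file contents:

```typescript
// Reconstructed sketch of switchCaseModalSize, derived from the switch
// statement removed from BaseModal: maps a modal size token to a pair of
// Tailwind utility classes for min-width and height.
function switchCaseModalSize(size: string): {
  minWidth: string;
  height: string;
} {
  switch (size) {
    case "x-small":
      return { minWidth: "min-w-[20vw]", height: "h-full" };
    case "smaller":
      return { minWidth: "min-w-[40vw]", height: "h-[11rem]" };
    case "smaller-h-full":
      return { minWidth: "min-w-[40vw]", height: "h-full" };
    case "small":
      return { minWidth: "min-w-[40vw]", height: "h-[40vh]" };
    case "small-h-full":
      return { minWidth: "min-w-[40vw]", height: "h-full" };
    case "medium":
      return { minWidth: "min-w-[60vw]", height: "h-[60vh]" };
    case "medium-h-full":
      return { minWidth: "min-w-[60vw]", height: "h-full" };
    case "large":
      return { minWidth: "min-w-[85vw]", height: "h-[80vh]" };
    case "three-cards":
      return { minWidth: "min-w-[1066px]", height: "h-fit" };
    case "large-thin":
      return { minWidth: "min-w-[65vw]", height: "h-[80vh]" };
    case "md-thin":
      return { minWidth: "min-w-[85vw]", height: "h-[70vh]" };
    case "sm-thin":
      return { minWidth: "min-w-[65vw]", height: "h-[70vh]" };
    case "large-h-full":
      return { minWidth: "min-w-[80vw]", height: "h-full" };
    default:
      // Same fallback as the removed switch's default branch.
      return { minWidth: "min-w-[80vw]", height: "h-[80vh]" };
  }
}
```

Returning an object from a pure function keeps the mapping testable in isolation, which the inline `let`-and-`switch` version was not.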
diff --git a/src/frontend/src/modals/deleteConfirmationModal/index.tsx b/src/frontend/src/modals/deleteConfirmationModal/index.tsx
index d03a0d707..a0eb1200f 100644
--- a/src/frontend/src/modals/deleteConfirmationModal/index.tsx
+++ b/src/frontend/src/modals/deleteConfirmationModal/index.tsx
@@ -59,7 +59,7 @@ export default function DeleteConfirmationModal({
e.stopPropagation()}
- className="mr-3"
+ className="mr-1"
variant="outline"
>
Cancel
diff --git a/src/frontend/src/modals/editNodeModal/index.tsx b/src/frontend/src/modals/editNodeModal/index.tsx
index 775f00f03..519dba852 100644
--- a/src/frontend/src/modals/editNodeModal/index.tsx
+++ b/src/frontend/src/modals/editNodeModal/index.tsx
@@ -15,7 +15,6 @@ import ShadTooltip from "../../components/shadTooltipComponent";
import TextAreaComponent from "../../components/textAreaComponent";
import ToggleShadComponent from "../../components/toggleShadComponent";
import { Badge } from "../../components/ui/badge";
-import { Button } from "../../components/ui/button";
import {
Table,
TableBody,
@@ -102,6 +101,16 @@ const EditNodeModal = forwardRef(
onChangeOpenModal={(open) => {
setMyData(data);
}}
+ onSubmit={() => {
+ setNode(data.id, (old) => ({
+ ...old,
+ data: {
+ ...old.data,
+ node: myData.node,
+ },
+ }));
+ setOpen(false);
+ }}
>
<></>
@@ -134,6 +143,7 @@ const EditNodeModal = forwardRef(
PARAM
+ DESC
VALUE
@@ -188,11 +198,17 @@ const EditNodeModal = forwardRef(
>
@@ -205,6 +221,20 @@ const EditNodeModal = forwardRef(
+
+
+
+ {data.node?.template[templateParam]?.info ??
+ ""}
+
+
+
-
- {
- setNode(data.id, (old) => ({
- ...old,
- data: {
- ...old.data,
- node: myData.node,
- },
- }));
- setOpen(false);
- }}
- type="submit"
- >
- Save Changes
-
-
+
);
}
diff --git a/src/frontend/src/modals/exportModal/index.tsx b/src/frontend/src/modals/exportModal/index.tsx
index 9eca267e8..8ebc3f863 100644
--- a/src/frontend/src/modals/exportModal/index.tsx
+++ b/src/frontend/src/modals/exportModal/index.tsx
@@ -1,7 +1,6 @@
import { ReactNode, forwardRef, useEffect, useState } from "react";
import EditFlowSettings from "../../components/editFlowSettingsComponent";
import IconComponent from "../../components/genericIconComponent";
-import { Button } from "../../components/ui/button";
import { Checkbox } from "../../components/ui/checkbox";
import { API_WARNING_NOTICE_ALERT } from "../../constants/alerts_constants";
import {
@@ -19,7 +18,7 @@ const ExportModal = forwardRef(
(props: { children: ReactNode }, ref): JSX.Element => {
const version = useDarkStore((state) => state.version);
const setNoticeData = useAlertStore((state) => state.setNoticeData);
- const [checked, setChecked] = useState(true);
+ const [checked, setChecked] = useState(false);
const currentFlow = useFlowsManagerStore((state) => state.currentFlow);
useEffect(() => {
setName(currentFlow!.name);
@@ -30,7 +29,43 @@ const ExportModal = forwardRef(
const [open, setOpen] = useState(false);
return (
-
+ {
+ if (checked) {
+ downloadFlow(
+ {
+ id: currentFlow!.id,
+ data: currentFlow!.data!,
+ description,
+ name,
+ last_tested_version: version,
+ is_component: false,
+ },
+ name!,
+ description,
+ );
+ setNoticeData({
+ title: API_WARNING_NOTICE_ALERT,
+ });
+ } else
+ downloadFlow(
+ removeApiKeys({
+ id: currentFlow!.id,
+ data: currentFlow!.data!,
+ description,
+ name,
+ last_tested_version: version,
+ is_component: false,
+ }),
+ name!,
+ description,
+ );
+ setOpen(false);
+ }}
+ >
{props.children}
Export
@@ -64,47 +99,9 @@ const ExportModal = forwardRef(
-
- {
- if (checked) {
- downloadFlow(
- {
- id: currentFlow!.id,
- data: currentFlow!.data!,
- description,
- name,
- last_tested_version: version,
- is_component: false,
- },
- name!,
- description
- );
- setNoticeData({
- title: API_WARNING_NOTICE_ALERT,
- });
- } else
- downloadFlow(
- removeApiKeys({
- id: currentFlow!.id,
- data: currentFlow!.data!,
- description,
- name,
- last_tested_version: version,
- is_component: false,
- }),
- name!,
- description
- );
- setOpen(false);
- }}
- type="submit"
- >
- Download Flow
-
-
+
);
- }
+ },
);
export default ExportModal;
diff --git a/src/frontend/src/modals/flowLogsModal/index.tsx b/src/frontend/src/modals/flowLogsModal/index.tsx
index 6f2a09633..d7c9625b7 100644
--- a/src/frontend/src/modals/flowLogsModal/index.tsx
+++ b/src/frontend/src/modals/flowLogsModal/index.tsx
@@ -16,41 +16,15 @@ export default function FlowLogsModal({
open,
setOpen,
}: FlowSettingsPropsType): JSX.Element {
- const saveFlow = useFlowsManagerStore((state) => state.saveFlow);
const nodes = useFlowStore((state) => state.nodes);
- const currentFlow = useFlowsManagerStore((state) => state.currentFlow);
const currentFlowId = useFlowsManagerStore((state) => state.currentFlowId);
- const flows = useFlowsManagerStore((state) => state.flows);
const setNoticeData = useAlertStore((state) => state.setNoticeData);
- useEffect(() => {
- setName(currentFlow!.name);
- setDescription(currentFlow!.description);
- }, [currentFlow!.name, currentFlow!.description, open]);
-
- const [name, setName] = useState(currentFlow!.name);
- const [description, setDescription] = useState(currentFlow!.description);
const [columns, setColumns] = useState>([]);
const [rows, setRows] = useState([]);
const [activeTab, setActiveTab] = useState("Executions");
const noticed = useRef(false);
- function handleClick(): void {
- currentFlow!.name = name;
- currentFlow!.description = description;
- saveFlow(currentFlow!)
- ?.then(() => {
- setOpen(false);
- })
- .catch((err) => {
- useAlertStore.getState().setErrorData({
- title: "Error while saving changes",
- list: [(err as AxiosError).response?.data.detail ?? ""],
- });
- console.error(err);
- });
- }
-
useEffect(() => {
if (activeTab === "Executions") {
getTransactionTable(currentFlowId, "union").then((data) => {
@@ -72,7 +46,7 @@ export default function FlowLogsModal({
.some((template) => template["stream"] && template["stream"].value);
console.log(
haStream,
- nodes.map((nodes) => (nodes.data as NodeDataType).node!.template)
+ nodes.map((nodes) => (nodes.data as NodeDataType).node!.template),
);
if (haStream) {
setNoticeData({
@@ -86,16 +60,6 @@ export default function FlowLogsModal({
}
}, [open, activeTab]);
- const [nameLists, setNameList] = useState([]);
-
- useEffect(() => {
- const tempNameList: string[] = [];
- flows.forEach((flow: FlowType) => {
- if ((flow.is_component ?? false) === false) tempNameList.push(flow.name);
- });
- setNameList(tempNameList.filter((name) => name !== currentFlow!.name));
- }, [flows]);
-
return (
diff --git a/src/frontend/src/modals/flowSettingsModal/index.tsx b/src/frontend/src/modals/flowSettingsModal/index.tsx
index ea4ad09cb..022bd9aa0 100644
--- a/src/frontend/src/modals/flowSettingsModal/index.tsx
+++ b/src/frontend/src/modals/flowSettingsModal/index.tsx
@@ -1,7 +1,6 @@
import { useEffect, useState } from "react";
import EditFlowSettings from "../../components/editFlowSettingsComponent";
import IconComponent from "../../components/genericIconComponent";
-import { Button } from "../../components/ui/button";
import { SETTINGS_DIALOG_SUBTITLE } from "../../constants/constants";
import useAlertStore from "../../stores/alertStore";
import useFlowsManagerStore from "../../stores/flowsManagerStore";
@@ -24,19 +23,21 @@ export default function FlowSettingsModal({
const [name, setName] = useState(currentFlow!.name);
const [description, setDescription] = useState(currentFlow!.description);
const [endpoint_name, setEndpointName] = useState(currentFlow!.endpoint_name);
-
+ const [isSaving, setIsSaving] = useState(false);
function handleClick(): void {
+ setIsSaving(true);
currentFlow!.name = name;
currentFlow!.description = description;
currentFlow!.endpoint_name = endpoint_name;
saveFlow(currentFlow!)
?.then(() => {
setOpen(false);
+ setIsSaving(false);
})
.catch((err) => {
useAlertStore.getState().setErrorData({
title: "Error while saving changes",
- list: [(err as AxiosError).response?.data.detail ?? ""],
+ list: [err?.response?.data.detail ?? ""],
});
console.error(err);
});
@@ -53,7 +54,12 @@ export default function FlowSettingsModal({
}, [flows]);
return (
-
+
Settings
@@ -70,15 +76,14 @@ export default function FlowSettingsModal({
/>
-
-
- Save
-
-
+
);
}
diff --git a/src/frontend/src/modals/genericModal/index.tsx b/src/frontend/src/modals/genericModal/index.tsx
index 369ccbb96..c842b38d7 100644
--- a/src/frontend/src/modals/genericModal/index.tsx
+++ b/src/frontend/src/modals/genericModal/index.tsx
@@ -83,7 +83,7 @@ export default function GenericModal({
}
const filteredWordsHighlight = matches.filter(
- (word) => !invalid_chars.includes(word)
+ (word) => !invalid_chars.includes(word),
);
setWordsHighlight(filteredWordsHighlight);
@@ -134,7 +134,7 @@ export default function GenericModal({
// to the first key of the custom_fields object
if (field_name === "") {
field_name = Array.isArray(
- apiReturn.data?.frontend_node?.custom_fields?.[""]
+ apiReturn.data?.frontend_node?.custom_fields?.[""],
)
? apiReturn.data?.frontend_node?.custom_fields?.[""][0] ?? ""
: apiReturn.data?.frontend_node?.custom_fields?.[""] ?? "";
@@ -209,7 +209,7 @@ export default function GenericModal({
{type === TypeModal.PROMPT && isEdit && !readonly ? (
diff --git a/src/frontend/src/modals/shareModal/index.tsx b/src/frontend/src/modals/shareModal/index.tsx
index a2c2f88d0..bde63de7e 100644
--- a/src/frontend/src/modals/shareModal/index.tsx
+++ b/src/frontend/src/modals/shareModal/index.tsx
@@ -1,4 +1,3 @@
-import { Loader2 } from "lucide-react";
import { ReactNode, useEffect, useMemo, useState } from "react";
import EditFlowSettings from "../../components/editFlowSettingsComponent";
import IconComponent from "../../components/genericIconComponent";
@@ -202,6 +201,18 @@ export default function ShareModal({
size="smaller-h-full"
open={(!disabled && open) ?? internalOpen}
setOpen={setOpen ?? internalSetOpen}
+ onSubmit={() => {
+ const isNameAvailable = !unavaliableNames.some(
+ (element) => element.name === name,
+ );
+
+ if (isNameAvailable) {
+ handleShareComponent();
+ (setOpen || internalSetOpen)(false);
+ } else {
+ setOpenConfirmationModal(true);
+ }
+ }}
>
{children ? children : <></>}
@@ -250,8 +261,13 @@ export default function ShareModal({
-
-
+
+ <>
{!is_component && (
)}
- {
- const isNameAvailable = !unavaliableNames.some(
- (element) => element.name === name,
- );
-
- if (isNameAvailable) {
- handleShareComponent();
- (setOpen || internalSetOpen)(false);
- } else {
- setOpenConfirmationModal(true);
- }
- }}
- >
- {loadingNames ? (
- <>
-
-
-
- </>
- ) : (
- <>
- Share{" "}
- {!loadingNames && (!is_component ? "Flow" : "Component")}
- </>
- )}
-
-
+ </>
<>{modalConfirmation}>
diff --git a/src/frontend/src/modals/storeApiKeyModal/index.tsx b/src/frontend/src/modals/storeApiKeyModal/index.tsx
deleted file mode 100644
index cdd2419a5..000000000
--- a/src/frontend/src/modals/storeApiKeyModal/index.tsx
+++ /dev/null
@@ -1,145 +0,0 @@
-import * as Form from "@radix-ui/react-form";
-import { useContext, useState } from "react";
-import IconComponent from "../../components/genericIconComponent";
-import { Button } from "../../components/ui/button";
-import { Input } from "../../components/ui/input";
-import {
- API_ERROR_ALERT,
- API_SUCCESS_ALERT,
-} from "../../constants/alerts_constants";
-import {
- CREATE_API_KEY,
- INSERT_API_KEY,
- INVALID_API_KEY,
- NO_API_KEY,
-} from "../../constants/constants";
-import { AuthContext } from "../../contexts/authContext";
-import { addApiKeyStore } from "../../controllers/API";
-import useAlertStore from "../../stores/alertStore";
-import { useStoreStore } from "../../stores/storeStore";
-import { StoreApiKeyType } from "../../types/components";
-import BaseModal from "../baseModal";
-
-export default function StoreApiKeyModal({
- children,
- disabled = false,
-}: StoreApiKeyType) {
- if (disabled) return <>{children}</>;
- const [open, setOpen] = useState(false);
- const setSuccessData = useAlertStore((state) => state.setSuccessData);
- const setErrorData = useAlertStore((state) => state.setErrorData);
- const { storeApiKey } = useContext(AuthContext);
- const [apiKeyValue, setApiKeyValue] = useState("");
-
- const validApiKey = useStoreStore((state) => state.validApiKey);
- const hasApiKey = useStoreStore((state) => state.hasApiKey);
- const setHasApiKey = useStoreStore((state) => state.updateHasApiKey);
- const setValidApiKey = useStoreStore((state) => state.updateValidApiKey);
- const setLoadingApiKey = useStoreStore((state) => state.updateLoadingApiKey);
-
- const handleSaveKey = () => {
- if (apiKeyValue) {
- addApiKeyStore(apiKeyValue).then(
- () => {
- setSuccessData({
- title: API_SUCCESS_ALERT,
- });
- storeApiKey(apiKeyValue);
- setOpen(false);
- setHasApiKey(true);
- setValidApiKey(true);
- setLoadingApiKey(false);
- },
- (error) => {
- setErrorData({
- title: API_ERROR_ALERT,
- list: [error["response"]["data"]["detail"]],
- });
- setHasApiKey(false);
- setValidApiKey(false);
- setLoadingApiKey(false);
- }
- );
- }
- };
-
- return (
-
- {children}
-
- API Key
-
-
-
- {
- event.preventDefault();
- handleSaveKey();
- }}
- >
-
-
-
-
- {
- setApiKeyValue(value);
- }}
- placeholder="Insert your API Key"
- />
-
-
-
-
-
-
- {CREATE_API_KEY}{" "}
-
- langflow.store
-
-
-
- {
- setOpen(false);
- }}
- >
- Cancel
-
-
-
-
- Save
-
-
-
-
-
-
-
- );
-}
diff --git a/src/frontend/src/pages/FlowPage/components/nodeToolbarComponent/index.tsx b/src/frontend/src/pages/FlowPage/components/nodeToolbarComponent/index.tsx
index 587dc6d73..045697e3a 100644
--- a/src/frontend/src/pages/FlowPage/components/nodeToolbarComponent/index.tsx
+++ b/src/frontend/src/pages/FlowPage/components/nodeToolbarComponent/index.tsx
@@ -58,7 +58,7 @@ export default function NodeToolbarComponent({
data.node.template[templateField].type === "Any" ||
data.node.template[templateField].type === "int" ||
data.node.template[templateField].type === "dict" ||
- data.node.template[templateField].type === "NestedDict"),
+ data.node.template[templateField].type === "NestedDict")
).length;
const templates = useTypesStore((state) => state.templates);
const hasStore = useStoreStore((state) => state.hasStore);
@@ -85,7 +85,7 @@ export default function NodeToolbarComponent({
const [showconfirmShare, setShowconfirmShare] = useState(false);
const [showOverrideModal, setShowOverrideModal] = useState(false);
const [flowComponent, setFlowComponent] = useState(
- createFlowComponent(cloneDeep(data), version),
+ createFlowComponent(cloneDeep(data), version)
);
const openInNewTab = (url) => {
@@ -100,7 +100,7 @@ export default function NodeToolbarComponent({
const updateNodeInternals = useUpdateNodeInternals();
const setLastCopiedSelection = useFlowStore(
- (state) => state.setLastCopiedSelection,
+ (state) => state.setLastCopiedSelection
);
const setSuccessData = useAlertStore((state) => state.setSuccessData);
@@ -141,6 +141,9 @@ export default function NodeToolbarComponent({
break;
case "disabled":
break;
+ case "unselect":
+ unselectAll();
+ break;
case "ungroup":
takeSnapshot();
expandGroupNode(
@@ -150,7 +153,7 @@ export default function NodeToolbarComponent({
nodes,
edges,
setNodes,
- setEdges,
+ setEdges
);
break;
case "override":
@@ -174,16 +177,16 @@ export default function NodeToolbarComponent({
y: 10,
paneX: nodes.find((node) => node.id === data.id)?.position.x,
paneY: nodes.find((node) => node.id === data.id)?.position.y,
- },
+ }
);
break;
case "update":
takeSnapshot();
// to update we must get the code from the templates in useTypesStore
- const thisNodeTemplate = templates[data.type].template;
+ const thisNodeTemplate = templates[data.type]?.template;
// if the template does not have a code key
// return
- if (!thisNodeTemplate.code) return;
+ if (!thisNodeTemplate?.code) return;
const currentCode = thisNodeTemplate.code.value;
if (data.node) {
@@ -212,13 +215,13 @@ export default function NodeToolbarComponent({
};
const isSaved = flows.some((flow) =>
- Object.values(flow).includes(data.node?.display_name!),
+ Object.values(flow).includes(data.node?.display_name!)
);
const setNode = useFlowStore((state) => state.setNode);
const handleOnNewValue = (
- newValue: string | string[] | boolean | Object[],
+ newValue: string | string[] | boolean | Object[]
): void => {
if (data.node!.template[name].value !== newValue) {
takeSnapshot();
@@ -276,6 +279,10 @@ export default function NodeToolbarComponent({
event.preventDefault();
handleSelectChange("update");
}
+ if (selected && event.key.toUpperCase() === "ESCAPE") {
+ event.preventDefault();
+ handleSelectChange("unselect");
+ }
if (
selected &&
isGroup &&
@@ -380,7 +387,7 @@ export default function NodeToolbarComponent({
return (
<>
-
+
{hasCode && (
@@ -401,7 +408,7 @@ export default function NodeToolbarComponent({
data-testid="save-button-modal"
className={classNames(
"relative -ml-px inline-flex items-center bg-background px-2 py-2 text-foreground shadow-md ring-1 ring-inset ring-ring transition-all duration-500 ease-in-out hover:bg-muted focus:z-10",
- hasCode ? " " : " rounded-l-md ",
+ hasCode ? " " : " rounded-l-md "
)}
onClick={(event) => {
event.preventDefault();
@@ -419,7 +426,7 @@ export default function NodeToolbarComponent({
{
event.preventDefault();
@@ -467,7 +474,7 @@ export default function NodeToolbarComponent({
state.myCollectionId);
const getFoldersApi = useFolderStore((state) => state.getFoldersApi);
const setFolderUrl = useFolderStore((state) => state.setFolderUrl);
+ const addFlow = useFlowsManagerStore((state) => state.addFlow);
useEffect(() => {
setFolderUrl(folderId ?? "");
@@ -115,7 +116,7 @@ export default function ComponentsComponent({
});
};
- const handleSelectOptionsChange = () => {
+ const handleSelectOptionsChange = (action: string) => {
const hasSelected = selectedFlowsComponentsCards?.length > 0;
if (!hasSelected) {
setErrorData({
@@ -124,7 +125,31 @@ export default function ComponentsComponent({
});
return;
}
- setOpenDelete(true);
+ if (action === "delete") {
+ setOpenDelete(true);
+ } else if (action === "duplicate") {
+ handleDuplicate();
+ }
+ };
+
+ const handleDuplicate = () => {
+ Promise.all(
+ selectedFlowsComponentsCards.map((selectedFlow) =>
+ addFlow(
+ true,
+ allFlows.find((flow) => flow.id === selectedFlow),
+ ),
+ ),
+ ).then(() => {
+ resetFilter();
+ getFoldersApi(true);
+ if (!folderId || folderId === myCollectionId) {
+ getFolderById(folderId ? folderId : myCollectionId);
+ }
+ setSelectedFlowsComponentsCards([]);
+
+ setSuccessData({ title: "Flows duplicated successfully" });
+ });
};
const handleDeleteMultiple = () => {
@@ -198,9 +223,10 @@ export default function ComponentsComponent({
<>
{allFlows?.length > 0 && (
handleSelectOptionsChange("delete")}
handleSelectAll={handleSelectAll}
- disableDelete={!(selectedFlowsComponentsCards?.length > 0)}
+ handleDuplicate={() => handleSelectOptionsChange("duplicate")}
+ disableFunctions={!(selectedFlowsComponentsCards?.length > 0)}
/>
)}
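The new `handleDuplicate` above fans the selected flow ids out through `Promise.all`, and only refreshes folders and clears the selection once every duplication resolves. The pattern can be sketched as a standalone helper; `addFlow` here is a hypothetical stand-in for the store action of the same name, not its real signature:

```typescript
// Sketch of the bulk-duplicate pattern: duplicate every selected flow
// concurrently, resolving once all of them have been created.
async function duplicateAll<T extends { id: string }>(
  selectedIds: string[],
  allFlows: T[],
  // Stand-in for the flowsManagerStore addFlow action (assumed shape).
  addFlow: (duplicate: boolean, flow?: T) => Promise<string>,
): Promise<string[]> {
  return Promise.all(
    selectedIds.map((id) =>
      addFlow(
        true,
        allFlows.find((flow) => flow.id === id),
      ),
    ),
  );
}
```

Because `Promise.all` rejects on the first failure, a failed duplication skips the success toast and the folder refresh, matching the `.then`-only chaining in the diff.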
diff --git a/src/frontend/src/pages/MainPage/components/emptyComponent/index.tsx b/src/frontend/src/pages/MainPage/components/emptyComponent/index.tsx
index 0655b6f47..fdcc7291a 100644
--- a/src/frontend/src/pages/MainPage/components/emptyComponent/index.tsx
+++ b/src/frontend/src/pages/MainPage/components/emptyComponent/index.tsx
@@ -1,5 +1,7 @@
import { useNavigate } from "react-router-dom";
import useFlowsManagerStore from "../../../../stores/flowsManagerStore";
+import NewFlowModal from "../../../../modals/newFlowModal";
+import { useState } from "react";
type EmptyComponentProps = {};
@@ -7,31 +9,28 @@ const EmptyComponent = ({}: EmptyComponentProps) => {
const addFlow = useFlowsManagerStore((state) => state.addFlow);
const navigate = useNavigate();
+ const [openModal, setOpenModal] = useState(false);
+
return (
<>
-
-
- Flows and components can be created using Langflow.
-
-
- New?
-
- {
- addFlow(true).then((id) => {
- navigate("/flow/" + id);
- });
- }}
- className="underline"
- >
- Start Here
-
- .
-
- 🚀
-
+
+
+ This folder is empty. New?
+
+
+
+ {
+ setOpenModal(true);
+ }}
+ className="underline"
+ >
+ Start Here
+
+
+ 🚀
diff --git a/src/frontend/src/pages/MainPage/components/headerComponent/index.tsx b/src/frontend/src/pages/MainPage/components/headerComponent/index.tsx
index 5b556f162..212816ea4 100644
--- a/src/frontend/src/pages/MainPage/components/headerComponent/index.tsx
+++ b/src/frontend/src/pages/MainPage/components/headerComponent/index.tsx
@@ -16,13 +16,15 @@ import ShadTooltip from "../../../../components/shadTooltipComponent";
type HeaderComponentProps = {
handleSelectAll: (select) => void;
handleDelete: () => void;
- disableDelete: boolean;
+ handleDuplicate: () => void;
+ disableFunctions: boolean;
};
const HeaderComponent = ({
handleSelectAll,
handleDelete,
- disableDelete,
+ handleDuplicate,
+ disableFunctions,
}: HeaderComponentProps) => {
const [shouldSelectAll, setShouldSelectAll] = useState(true);
@@ -55,23 +57,41 @@ const HeaderComponent = ({
-
+
Select items to duplicate
+ ) : (
+ Duplicate selected items
+ )
+ }
+ >
+
+
+
+
+
+
+
Select items to delete
) : (
Delete selected items
)
}
>
-
+
diff --git a/src/frontend/src/pages/Playground/index.tsx b/src/frontend/src/pages/Playground/index.tsx
index d26147df6..4ea3cc506 100644
--- a/src/frontend/src/pages/Playground/index.tsx
+++ b/src/frontend/src/pages/Playground/index.tsx
@@ -11,7 +11,7 @@ export default function PlaygroundPage() {
const currentFlow = useFlowsManagerStore((state) => state.currentFlow);
const getFlowById = useFlowsManagerStore((state) => state.getFlowById);
const setCurrentFlowId = useFlowsManagerStore(
- (state) => state.setCurrentFlowId
+ (state) => state.setCurrentFlowId,
);
const currentFlowId = useFlowsManagerStore((state) => state.currentFlowId);
const setCurrentFlow = useFlowsManagerStore((state) => state.setCurrentFlow);
diff --git a/src/frontend/src/pages/SettingsPage/index.tsx b/src/frontend/src/pages/SettingsPage/index.tsx
index e4f856599..0abab2aec 100644
--- a/src/frontend/src/pages/SettingsPage/index.tsx
+++ b/src/frontend/src/pages/SettingsPage/index.tsx
@@ -21,7 +21,7 @@ export default function SettingsPage(): JSX.Element {
icon: (
),
},
@@ -32,7 +32,7 @@ export default function SettingsPage(): JSX.Element {
icon: (
),
},
@@ -40,7 +40,10 @@ export default function SettingsPage(): JSX.Element {
title: "Shortcuts",
href: "/settings/shortcuts",
icon: (
-
+
),
},
];
@@ -50,10 +53,10 @@ export default function SettingsPage(): JSX.Element {
description="Manage the general settings for Langflow."
>
-
+
-
diff --git a/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/GeneralPageHeader/index.tsx b/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/GeneralPageHeader/index.tsx
new file mode 100644
index 000000000..cb65942bd
--- /dev/null
+++ b/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/GeneralPageHeader/index.tsx
@@ -0,0 +1,23 @@
+import ForwardedIconComponent from "../../../../../../components/genericIconComponent";
+
+const GeneralPageHeaderComponent = () => {
+ return (
+ <>
+
+
+
+ General
+
+
+
+ Manage settings related to Langflow and your account.
+
+
+
+ </>
+ );
+};
+export default GeneralPageHeaderComponent;
diff --git a/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/PasswordForm/index.tsx b/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/PasswordForm/index.tsx
new file mode 100644
index 000000000..28edb2f09
--- /dev/null
+++ b/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/PasswordForm/index.tsx
@@ -0,0 +1,93 @@
+import * as Form from "@radix-ui/react-form";
+import InputComponent from "../../../../../../components/inputComponent";
+import { Button } from "../../../../../../components/ui/button";
+import {
+ Card,
+ CardContent,
+ CardDescription,
+ CardFooter,
+ CardHeader,
+ CardTitle,
+} from "../../../../../../components/ui/card";
+
+type PasswordFormComponentProps = {
+ password: string;
+ cnfPassword: string;
+ handleInput: (event: any) => void;
+ handlePatchPassword: (
+ password: string,
+ cnfPassword: string,
+ handleInput: any,
+ ) => void;
+};
+const PasswordFormComponent = ({
+ password,
+ cnfPassword,
+ handleInput,
+ handlePatchPassword,
+}: PasswordFormComponentProps) => {
+ return (
+ <>
+ {
+ handlePatchPassword(password, cnfPassword, handleInput);
+ event.preventDefault();
+ }}
+ >
+
+
+ Password
+
+ Type your new password and confirm it.
+
+
+
+
+
+ {
+ handleInput({ target: { name: "password", value } });
+ }}
+ value={password}
+ isForm
+ password={true}
+ placeholder="Password"
+ className="w-full"
+ />
+
+ Please enter your password
+
+
+
+ {
+ handleInput({
+ target: { name: "cnfPassword", value },
+ });
+ }}
+ value={cnfPassword}
+ isForm
+ password={true}
+ placeholder="Confirm Password"
+ className="w-full"
+ />
+
+
+ Please confirm your password
+
+
+
+
+
+
+ Save
+
+
+
+
+ </>
+ );
+};
+export default PasswordFormComponent;
diff --git a/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/ProfileGradientForm/index.tsx b/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/ProfileGradientForm/index.tsx
new file mode 100644
index 000000000..a92a00103
--- /dev/null
+++ b/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/ProfileGradientForm/index.tsx
@@ -0,0 +1,68 @@
+import * as Form from "@radix-ui/react-form";
+import GradientChooserComponent from "../../../../../../components/gradientChooserComponent";
+import { Button } from "../../../../../../components/ui/button";
+import {
+ Card,
+ CardContent,
+ CardDescription,
+ CardFooter,
+ CardHeader,
+ CardTitle,
+} from "../../../../../../components/ui/card";
+import { gradients } from "../../../../../../utils/styleUtils";
+
+type ProfileGradientFormComponentProps = {
+ gradient: string;
+ handleInput: (event: any) => void;
+ handlePatchGradient: (gradient: string) => void;
+ userData: any;
+};
+const ProfileGradientFormComponent = ({
+ gradient,
+ handleInput,
+ handlePatchGradient,
+ userData,
+}: ProfileGradientFormComponentProps) => {
+ return (
+ <>
+ {
+ handlePatchGradient(gradient);
+ event.preventDefault();
+ }}
+ >
+
+
+ Profile Gradient
+
+ Choose the gradient that appears as your profile picture.
+
+
+
+
+ {
+ handleInput({ target: { name: "gradient", value } });
+ }}
+ />
+
+
+
+
+ Save
+
+
+
+
+ </>
+ );
+};
+export default ProfileGradientFormComponent;
diff --git a/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/StoreApiKeyForm/index.tsx b/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/StoreApiKeyForm/index.tsx
new file mode 100644
index 000000000..5bc5ea38b
--- /dev/null
+++ b/src/frontend/src/pages/SettingsPage/pages/GeneralPage/components/StoreApiKeyForm/index.tsx
@@ -0,0 +1,102 @@
+import * as Form from "@radix-ui/react-form";
+import InputComponent from "../../../../../../components/inputComponent";
+import { Button } from "../../../../../../components/ui/button";
+import {
+ Card,
+ CardContent,
+ CardDescription,
+ CardFooter,
+ CardHeader,
+ CardTitle,
+} from "../../../../../../components/ui/card";
+import {
+ CREATE_API_KEY,
+ INSERT_API_KEY,
+ INVALID_API_KEY,
+ NO_API_KEY,
+} from "../../../../../../constants/constants";
+
+type StoreApiKeyFormComponentProps = {
+ apikey: string;
+ handleInput: (event: any) => void;
+ handleSaveKey: (apikey: string, handleInput: any) => void;
+ loadingApiKey: boolean;
+ validApiKey: boolean;
+ hasApiKey: boolean;
+};
+const StoreApiKeyFormComponent = ({
+ apikey,
+ handleInput,
+ handleSaveKey,
+ loadingApiKey,
+ validApiKey,
+ hasApiKey,
+}: StoreApiKeyFormComponentProps) => {
+ return (
+ <>
+ {
+ event.preventDefault();
+ handleSaveKey(apikey, handleInput);
+ }}
+ >
+
+
+ Store API Key
+
+ {(hasApiKey && !validApiKey
+ ? INVALID_API_KEY
+ : !hasApiKey
+ ? NO_API_KEY
+ : "") + INSERT_API_KEY}
+
+
+
+
+
+
+ {
+ handleInput({ target: { name: "apikey", value } });
+ }}
+ value={apikey}
+ isForm
+ password={true}
+ placeholder="Insert your API Key"
+ className="w-full"
+ />
+
+ Please enter your API Key
+
+
+
+
+ {CREATE_API_KEY}{" "}
+
+ langflow.store
+
+
+
+
+
+
+
+ Save
+
+
+
+
+
+ </>
+ );
+};
+export default StoreApiKeyFormComponent;
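The nested ternary that builds the card description above reads: an existing-but-invalid key shows `INVALID_API_KEY`, a missing key shows `NO_API_KEY`, and a valid key shows nothing, with `INSERT_API_KEY` always appended. A hypothetical pure helper making the branches explicit — the constant strings here are placeholders, not the real values from `constants.ts`:

```typescript
// Placeholder values; the real strings live in constants/constants.ts.
const INVALID_API_KEY = "Your API key is not valid. ";
const NO_API_KEY = "You don't have an API key. ";
const INSERT_API_KEY = "Insert your Langflow API key.";

// Mirrors the nested ternary in StoreApiKeyFormComponent's CardDescription.
function storeApiKeyDescription(
  hasApiKey: boolean,
  validApiKey: boolean,
): string {
  const prefix =
    hasApiKey && !validApiKey ? INVALID_API_KEY : !hasApiKey ? NO_API_KEY : "";
  return prefix + INSERT_API_KEY;
}
```

Extracting the expression like this makes the three states (valid, invalid, absent) directly unit-testable.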
diff --git a/src/frontend/src/pages/SettingsPage/pages/GeneralPage/index.tsx b/src/frontend/src/pages/SettingsPage/pages/GeneralPage/index.tsx
index 5c53e0aa8..718097152 100644
--- a/src/frontend/src/pages/SettingsPage/pages/GeneralPage/index.tsx
+++ b/src/frontend/src/pages/SettingsPage/pages/GeneralPage/index.tsx
@@ -1,221 +1,107 @@
-import * as Form from "@radix-ui/react-form";
-import { cloneDeep } from "lodash";
-import { useContext, useEffect, useState } from "react";
-import ForwardedIconComponent from "../../../../components/genericIconComponent";
-import GradientChooserComponent from "../../../../components/gradientChooserComponent";
-import InputComponent from "../../../../components/inputComponent";
-import { Button } from "../../../../components/ui/button";
-import {
- Card,
- CardContent,
- CardDescription,
- CardFooter,
- CardHeader,
- CardTitle,
-} from "../../../../components/ui/card";
-import {
- EDIT_PASSWORD_ALERT_LIST,
- EDIT_PASSWORD_ERROR_ALERT,
- SAVE_ERROR_ALERT,
- SAVE_SUCCESS_ALERT,
-} from "../../../../constants/alerts_constants";
+import { useContext, useState } from "react";
+import { useParams } from "react-router-dom";
import { CONTROL_PATCH_USER_STATE } from "../../../../constants/constants";
import { AuthContext } from "../../../../contexts/authContext";
-import { resetPassword, updateUser } from "../../../../controllers/API";
import useAlertStore from "../../../../stores/alertStore";
import useFlowsManagerStore from "../../../../stores/flowsManagerStore";
+import { useStoreStore } from "../../../../stores/storeStore";
import {
inputHandlerEventType,
patchUserInputStateType,
} from "../../../../types/components";
-import { gradients } from "../../../../utils/styleUtils";
+import usePatchGradient from "../hooks/use-patch-gradient";
+import usePatchPassword from "../hooks/use-patch-password";
+import useSaveKey from "../hooks/use-save-key";
+import useScrollToElement from "../hooks/use-scroll-to-element";
+import GeneralPageHeaderComponent from "./components/GeneralPageHeader";
+import PasswordFormComponent from "./components/PasswordForm";
+import ProfileGradientFormComponent from "./components/ProfileGradientForm";
+import StoreApiKeyFormComponent from "./components/StoreApiKeyForm";
export default function GeneralPage() {
const setCurrentFlowId = useFlowsManagerStore(
(state) => state.setCurrentFlowId,
);
+ const { scrollId } = useParams();
+
const [inputState, setInputState] = useState(
CONTROL_PATCH_USER_STATE,
);
const { autoLogin } = useContext(AuthContext);
- // set null id
- useEffect(() => {
- setCurrentFlowId("");
- }, []);
const setSuccessData = useAlertStore((state) => state.setSuccessData);
const setErrorData = useAlertStore((state) => state.setErrorData);
const { userData, setUserData } = useContext(AuthContext);
- const { password, cnfPassword, gradient } = inputState;
+ const hasStore = useStoreStore((state) => state.hasStore);
- async function handlePatchPassword() {
- if (password !== cnfPassword) {
- setErrorData({
- title: EDIT_PASSWORD_ERROR_ALERT,
- list: [EDIT_PASSWORD_ALERT_LIST],
- });
- return;
- }
- try {
- if (password !== "") await resetPassword(userData!.id, { password });
- handleInput({ target: { name: "password", value: "" } });
- handleInput({ target: { name: "cnfPassword", value: "" } });
- setSuccessData({ title: SAVE_SUCCESS_ALERT });
- } catch (error) {
- setErrorData({
- title: SAVE_ERROR_ALERT,
- list: [(error as any).response.data.detail],
- });
- }
- }
+ const validApiKey = useStoreStore((state) => state.validApiKey);
+ const hasApiKey = useStoreStore((state) => state.hasApiKey);
+ const setHasApiKey = useStoreStore((state) => state.updateHasApiKey);
+ const loadingApiKey = useStoreStore((state) => state.loadingApiKey);
+ const setValidApiKey = useStoreStore((state) => state.updateValidApiKey);
+ const setLoadingApiKey = useStoreStore((state) => state.updateLoadingApiKey);
+ const { password, cnfPassword, gradient, apikey } = inputState;
- async function handlePatchGradient() {
- try {
- if (gradient !== "")
- await updateUser(userData!.id, { profile_image: gradient });
- if (gradient !== "") {
- let newUserData = cloneDeep(userData);
- newUserData!.profile_image = gradient;
+ const { handlePatchPassword } = usePatchPassword(
+ userData,
+ setSuccessData,
+ setErrorData,
+ );
- setUserData(newUserData);
- }
- setSuccessData({ title: SAVE_SUCCESS_ALERT });
- } catch (error) {
- setErrorData({
- title: SAVE_ERROR_ALERT,
- list: [(error as any).response.data.detail],
- });
- }
- }
+ const { handlePatchGradient } = usePatchGradient(
+ setSuccessData,
+ setErrorData,
+ userData,
+ setUserData,
+ );
+
+ useScrollToElement(scrollId, setCurrentFlowId);
+
+ const { handleSaveKey } = useSaveKey(
+ setSuccessData,
+ setErrorData,
+ setHasApiKey,
+ setValidApiKey,
+ setLoadingApiKey,
+ );
function handleInput({
target: { name, value },
}: inputHandlerEventType): void {
setInputState((prev) => ({ ...prev, [name]: value }));
}
+
return (
-
-
-
- General
-
-
-
- Manage settings related to Langflow and your account.
-
-
-
+
-
{
- handlePatchGradient();
- event.preventDefault();
- }}
- >
-
-
- Profile Gradient
-
- Choose the gradient that appears as your profile picture.
-
-
-
-
- {
- handleInput({ target: { name: "gradient", value } });
- }}
- />
-
-
-
-
- Save
-
-
-
-
- {!autoLogin && (
-
{
- handlePatchPassword();
- event.preventDefault();
- }}
- >
-
-
- Password
-
- Type your new password and confirm it.
-
-
-
-
-
- {
- handleInput({ target: { name: "password", value } });
- }}
- value={password}
- isForm
- password={true}
- placeholder="Password"
- className="w-full"
- />
-
- Please enter your password
-
-
-
- {
- handleInput({
- target: { name: "cnfPassword", value },
- });
- }}
- value={cnfPassword}
- isForm
- password={true}
- placeholder="Confirm Password"
- className="w-full"
- />
+
-
- Please confirm your password
-
-
-
-
-
-
- Save
-
-
-
-
+ {!autoLogin && (
+
+ )}
+ {hasStore && (
+
)}
diff --git a/src/frontend/src/pages/SettingsPage/pages/GlobalVariablesPage/index.tsx b/src/frontend/src/pages/SettingsPage/pages/GlobalVariablesPage/index.tsx
index 65f7ecd41..195e5d37c 100644
--- a/src/frontend/src/pages/SettingsPage/pages/GlobalVariablesPage/index.tsx
+++ b/src/frontend/src/pages/SettingsPage/pages/GlobalVariablesPage/index.tsx
@@ -170,7 +170,7 @@ export default function GlobalVariablesPage() {
-
-
+
{
+ const handlePatchGradient = async (gradient) => {
+ try {
+ if (gradient !== "") {
+ await updateUser(currentUserData.id, { profile_image: gradient });
+ let newUserData = cloneDeep(currentUserData);
+ newUserData.profile_image = gradient;
+ setUserData(newUserData);
+ }
+ setSuccessData({ title: SAVE_SUCCESS_ALERT });
+ } catch (error) {
+ setErrorData({
+ title: SAVE_ERROR_ALERT,
+ list: [(error as any)?.response?.data?.detail],
+ });
+ }
+ };
+
+ return {
+ currentUserData,
+ handlePatchGradient,
+ };
+};
+
+export default usePatchGradient;
diff --git a/src/frontend/src/pages/SettingsPage/pages/hooks/use-patch-password.tsx b/src/frontend/src/pages/SettingsPage/pages/hooks/use-patch-password.tsx
new file mode 100644
index 000000000..c4b452e6b
--- /dev/null
+++ b/src/frontend/src/pages/SettingsPage/pages/hooks/use-patch-password.tsx
@@ -0,0 +1,36 @@
+import {
+ EDIT_PASSWORD_ALERT_LIST,
+ EDIT_PASSWORD_ERROR_ALERT,
+ SAVE_ERROR_ALERT,
+ SAVE_SUCCESS_ALERT,
+} from "../../../../constants/alerts_constants";
+import { resetPassword } from "../../../../controllers/API";
+
+const usePatchPassword = (userData, setSuccessData, setErrorData) => {
+ const handlePatchPassword = async (password, cnfPassword, handleInput) => {
+ if (password !== cnfPassword) {
+ setErrorData({
+ title: EDIT_PASSWORD_ERROR_ALERT,
+ list: [EDIT_PASSWORD_ALERT_LIST],
+ });
+ return;
+ }
+ try {
+ if (password !== "") await resetPassword(userData.id, { password });
+ handleInput({ target: { name: "password", value: "" } });
+ handleInput({ target: { name: "cnfPassword", value: "" } });
+ setSuccessData({ title: SAVE_SUCCESS_ALERT });
+ } catch (error) {
+ setErrorData({
+ title: SAVE_ERROR_ALERT,
+ list: [(error as any)?.response?.data?.detail],
+ });
+ }
+ };
+
+ return {
+ handlePatchPassword,
+ };
+};
+
+export default usePatchPassword;
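The extracted `usePatchPassword` hook keeps the original control flow: a confirm-password mismatch fails fast with an error alert, an empty password skips the API call entirely, and only a non-empty matching password triggers the reset. A minimal standalone sketch of that guard logic, independent of React (the function and type names here are illustrative, not part of the patch):

```typescript
// Hypothetical sketch of the guard logic in use-patch-password.tsx:
// mismatched passwords short-circuit before any network call, an empty
// password is a no-op "success" (inputs are still cleared), and only a
// non-empty matching password would reach resetPassword().
type PatchResult = "mismatch" | "noop" | "reset";

function classifyPasswordPatch(password: string, cnfPassword: string): PatchResult {
  if (password !== cnfPassword) return "mismatch"; // error alert path
  if (password === "") return "noop"; // nothing to send
  return "reset"; // resetPassword(userData.id, { password }) would run here
}
```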
diff --git a/src/frontend/src/pages/SettingsPage/pages/hooks/use-save-key.tsx b/src/frontend/src/pages/SettingsPage/pages/hooks/use-save-key.tsx
new file mode 100644
index 000000000..bdd105fef
--- /dev/null
+++ b/src/frontend/src/pages/SettingsPage/pages/hooks/use-save-key.tsx
@@ -0,0 +1,48 @@
+import { useContext } from "react";
+import {
+ API_ERROR_ALERT,
+ API_SUCCESS_ALERT,
+} from "../../../../constants/alerts_constants";
+import { AuthContext } from "../../../../contexts/authContext";
+import { addApiKeyStore } from "../../../../controllers/API";
+
+const useSaveKey = (
+ setSuccessData,
+ setErrorData,
+ setHasApiKey,
+ setValidApiKey,
+ setLoadingApiKey
+) => {
+ const { storeApiKey } = useContext(AuthContext);
+
+ const handleSaveKey = (apikey, handleInput) => {
+ if (apikey) {
+ setLoadingApiKey(true);
+ addApiKeyStore(apikey).then(
+ () => {
+ setSuccessData({ title: API_SUCCESS_ALERT });
+ storeApiKey(apikey);
+ setHasApiKey(true);
+ setValidApiKey(true);
+ setLoadingApiKey(false);
+ handleInput({ target: { name: "apikey", value: "" } });
+ },
+ (error) => {
+ setErrorData({
+ title: API_ERROR_ALERT,
+ list: [error?.response?.data?.detail],
+ });
+ setHasApiKey(false);
+ setValidApiKey(false);
+ setLoadingApiKey(false);
+ }
+ );
+ }
+ };
+
+ return {
+ handleSaveKey,
+ };
+};
+
+export default useSaveKey;
diff --git a/src/frontend/src/pages/SettingsPage/pages/hooks/use-scroll-to-element.tsx b/src/frontend/src/pages/SettingsPage/pages/hooks/use-scroll-to-element.tsx
new file mode 100644
index 000000000..a56c2d5d6
--- /dev/null
+++ b/src/frontend/src/pages/SettingsPage/pages/hooks/use-scroll-to-element.tsx
@@ -0,0 +1,17 @@
+import { useEffect } from "react";
+
+const useScrollToElement = (scrollId, setCurrentFlowId) => {
+ useEffect(() => {
+ const element = document.getElementById(scrollId ?? "null");
+ if (element) {
+ // Scroll smoothly to the section matching the scrollId route param
+ element.scrollIntoView({ behavior: "smooth" });
+ }
+ }, [scrollId]);
+
+ useEffect(() => {
+ setCurrentFlowId("");
+ }, [setCurrentFlowId]);
+};
+
+export default useScrollToElement;
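The `scrollId ?? "null"` fallback in `useScrollToElement` relies on no element having the literal id `"null"`, so a missing route param simply matches nothing and no scroll happens. A standalone sketch of that lookup, with the DOM abstracted behind a function so it runs outside a browser (names are illustrative):

```typescript
// Hypothetical sketch of the scroll-target lookup in
// use-scroll-to-element.tsx: an undefined scrollId falls back to the
// literal string "null", which is expected to match no element, so the
// scrollIntoView call is skipped.
function resolveScrollTarget(
  scrollId: string | undefined,
  getById: (id: string) => { id: string } | null,
): { id: string } | null {
  return getById(scrollId ?? "null");
}
```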
diff --git a/src/frontend/src/pages/StorePage/index.tsx b/src/frontend/src/pages/StorePage/index.tsx
index 3b4612281..2fc401d8a 100644
--- a/src/frontend/src/pages/StorePage/index.tsx
+++ b/src/frontend/src/pages/StorePage/index.tsx
@@ -29,7 +29,6 @@ import {
import { STORE_DESC, STORE_TITLE } from "../../constants/constants";
import { AuthContext } from "../../contexts/authContext";
import { getStoreComponents, getStoreTags } from "../../controllers/API";
-import StoreApiKeyModal from "../../modals/storeApiKeyModal";
import useAlertStore from "../../stores/alertStore";
import useFlowsManagerStore from "../../stores/flowsManagerStore";
import { useStoreStore } from "../../stores/storeStore";
@@ -180,24 +179,21 @@ export default function StorePage(): JSX.Element {
title={STORE_TITLE}
description={STORE_DESC}
button={
- <>
- {StoreApiKeyModal && (
-
-
-
- API Key
-
-
+
+ variant="primary"
+ onClick={() => {
+ navigate("/settings/general/api");
+ }}
+ >
+
+ API Key
+
}
>
diff --git a/src/frontend/src/routes.tsx b/src/frontend/src/routes.tsx
index 21df212fd..25f63082b 100644
--- a/src/frontend/src/routes.tsx
+++ b/src/frontend/src/routes.tsx
@@ -76,7 +76,7 @@ const Router = () => {
>
} />
} />
-
} />
+
} />
} />
((set, get) => ({
startNodeId,
stopNodeId,
input_value,
+ silent,
}: {
startNodeId?: string;
stopNodeId?: string;
input_value?: string;
+ silent?: boolean;
}) => {
get().setIsBuilding(true);
const currentFlow = useFlowsManagerStore.getState().currentFlow;
@@ -536,19 +538,23 @@ const useFlowStore = create((set, get) => ({
startNodeId,
stopNodeId,
onGetOrderSuccess: () => {
- setNoticeData({ title: "Running components" });
+ if (!silent) {
+ setNoticeData({ title: "Running components" });
+ }
},
onBuildComplete: (allNodesValid) => {
const nodeId = startNodeId || stopNodeId;
- if (nodeId && allNodesValid) {
- setSuccessData({
- title: `${
- get().nodes.find((node) => node.id === nodeId)?.data.node
- ?.display_name
- } built successfully`,
- });
- } else {
- setSuccessData({ title: FLOW_BUILD_SUCCESS_ALERT });
+ if (!silent) {
+ if (nodeId && allNodesValid) {
+ setSuccessData({
+ title: `${
+ get().nodes.find((node) => node.id === nodeId)?.data.node
+ ?.display_name
+ } built successfully`,
+ });
+ } else {
+ setSuccessData({ title: FLOW_BUILD_SUCCESS_ALERT });
+ }
}
get().setIsBuilding(false);
},
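The new `silent` flag on `buildFlow` suppresses both the "Running components" notice and the success alert while leaving the build itself untouched. A standalone sketch of that gating (the generic fallback title here stands in for `FLOW_BUILD_SUCCESS_ALERT`, whose exact value is an assumption):

```typescript
// Hypothetical sketch of the silent gating added to buildFlow: when
// silent is true no alerts fire at all; otherwise a notice fires on
// order success and a success alert on build completion, using the
// node's display name when a start/stop node id was given.
type Alert = { kind: "notice" | "success"; title: string };

function alertsForBuild(silent: boolean, nodeName?: string): Alert[] {
  if (silent) return []; // build still runs, UI stays quiet
  return [
    { kind: "notice", title: "Running components" },
    {
      kind: "success",
      title: nodeName
        ? `${nodeName} built successfully`
        : "Flow built successfully", // assumed FLOW_BUILD_SUCCESS_ALERT text
    },
  ];
}
```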
diff --git a/src/frontend/src/stores/flowsManagerStore.ts b/src/frontend/src/stores/flowsManagerStore.ts
index 52a8eec8c..28537a56e 100644
--- a/src/frontend/src/stores/flowsManagerStore.ts
+++ b/src/frontend/src/stores/flowsManagerStore.ts
@@ -1,4 +1,5 @@
-import { cloneDeep, debounce } from "lodash";
+import { cloneDeep } from "lodash";
+import * as debounce from "debounce-promise";
import { Edge, Node, Viewport, XYPosition } from "reactflow";
import { create } from "zustand";
import { SAVE_DEBOUNCE_TIME } from "../constants/constants";
@@ -86,12 +87,12 @@ const useFlowsManagerStore = create((set, get) => ({
if (dbData) {
const { data, flows } = processFlows(dbData, false);
const examples = flows.filter(
- (flow) => flow.folder_id === starterFolderId
+ (flow) => flow.folder_id === starterFolderId,
);
get().setExamples(examples);
const flowsWithoutStarterFolder = flows.filter(
- (flow) => flow.folder_id !== starterFolderId
+ (flow) => flow.folder_id !== starterFolderId,
);
get().setFlows(flowsWithoutStarterFolder);
@@ -119,7 +120,7 @@ const useFlowsManagerStore = create((set, get) => ({
if (get().currentFlow) {
get().saveFlow(
{ ...get().currentFlow!, data: { nodes, edges, viewport } },
- true
+ true,
);
}
},
@@ -145,7 +146,7 @@ const useFlowsManagerStore = create((set, get) => ({
return updatedFlow;
}
return flow;
- })
+ }),
);
//update tabs state
@@ -194,7 +195,7 @@ const useFlowsManagerStore = create((set, get) => ({
flow?: FlowType,
override?: boolean,
position?: XYPosition,
- fromDragAndDrop?: boolean
+ fromDragAndDrop?: boolean,
): Promise => {
if (newProject) {
let flowData = flow
@@ -210,7 +211,7 @@ const useFlowsManagerStore = create((set, get) => ({
const newFlow = createNewFlow(
flowData!,
flow!,
- folder_id || my_collection_id!
+ folder_id || my_collection_id!,
);
const { id } = await saveFlowToDatabase(newFlow);
newFlow.id = id;
@@ -233,7 +234,7 @@ const useFlowsManagerStore = create((set, get) => ({
const newFlow = createNewFlow(
flowData!,
flow!,
- folder_id || my_collection_id!
+ folder_id || my_collection_id!,
);
const newName = addVersionToDuplicates(newFlow, get().flows);
@@ -269,7 +270,7 @@ const useFlowsManagerStore = create((set, get) => ({
.getState()
.paste(
{ nodes: flow!.data!.nodes, edges: flow!.data!.edges },
- position ?? { x: 10, y: 10 }
+ position ?? { x: 10, y: 10 },
);
}
},
@@ -279,7 +280,7 @@ const useFlowsManagerStore = create((set, get) => ({
multipleDeleteFlowsComponents(id)
.then(() => {
const { data, flows } = processFlows(
- get().flows.filter((flow) => !id.includes(flow.id))
+ get().flows.filter((flow) => !id.includes(flow.id)),
);
get().setFlows(flows);
set({ isLoading: false });
@@ -299,7 +300,7 @@ const useFlowsManagerStore = create((set, get) => ({
deleteFlowFromDatabase(id)
.then(() => {
const { data, flows } = processFlows(
- get().flows.filter((flow) => flow.id !== id)
+ get().flows.filter((flow) => flow.id !== id),
);
get().setFlows(flows);
set({ isLoading: false });
@@ -321,7 +322,7 @@ const useFlowsManagerStore = create((set, get) => ({
return new Promise((resolve) => {
let componentFlow = get().flows.find(
(componentFlow) =>
- componentFlow.is_component && componentFlow.name === key
+ componentFlow.is_component && componentFlow.name === key,
);
if (componentFlow) {
@@ -369,7 +370,7 @@ const useFlowsManagerStore = create((set, get) => ({
fileData,
undefined,
position,
- true
+ true,
);
resolve(id);
}
@@ -410,7 +411,7 @@ const useFlowsManagerStore = create((set, get) => ({
return get().addFlow(
true,
createFlowComponent(component, useDarkStore.getState().version),
- override
+ override,
);
},
takeSnapshot: () => {
@@ -431,7 +432,7 @@ const useFlowsManagerStore = create((set, get) => ({
if (pastLength > 0) {
past[currentFlowId] = past[currentFlowId].slice(
pastLength - defaultOptions.maxHistorySize + 1,
- pastLength
+ pastLength,
);
past[currentFlowId].push(newState);
diff --git a/src/frontend/src/style/applies.css b/src/frontend/src/style/applies.css
index 0fabaa0ee..07ab9b592 100644
--- a/src/frontend/src/style/applies.css
+++ b/src/frontend/src/style/applies.css
@@ -9,7 +9,9 @@
body {
@apply bg-background text-foreground;
- font-feature-settings: "rlig" 1, "calt" 1;
+ font-feature-settings:
+ "rlig" 1,
+ "calt" 1;
}
}
diff --git a/src/frontend/src/types/components/index.ts b/src/frontend/src/types/components/index.ts
index 28848bcc0..199cbb1f2 100644
--- a/src/frontend/src/types/components/index.ts
+++ b/src/frontend/src/types/components/index.ts
@@ -221,6 +221,7 @@ export type AccordionComponentType = {
children?: ReactElement;
open?: string[];
trigger?: string | ReactElement;
+ disabled?: boolean;
keyValue?: string;
openDisc?: boolean;
sideBar?: boolean;
@@ -378,6 +379,7 @@ export type patchUserInputStateType = {
password: string;
cnfPassword: string;
gradient: string;
+ apikey: string;
};
export type UserInputType = {
@@ -519,7 +521,7 @@ export type nodeToolbarPropsType = {
updateNodeCode?: (
newNodeClass: APIClassType,
code: string,
- name: string
+ name: string,
) => void;
setShowState: (show: boolean | SetStateAction) => void;
isOutdated?: boolean;
@@ -569,7 +571,7 @@ export type chatMessagePropsType = {
updateChat: (
chat: ChatMessageType,
message: string,
- stream_url?: string
+ stream_url?: string,
) => void;
};
@@ -661,12 +663,12 @@ export type codeTabsPropsType = {
value: string,
node: NodeType,
template: TemplateVariableType,
- tweak: tweakType
+ tweak: tweakType,
) => string;
buildTweakObject?: (
tw: string,
changes: string | string[] | boolean | number | Object[] | Object,
- template: TemplateVariableType
+ template: TemplateVariableType,
) => Promise;
};
activeTweaks?: boolean;
diff --git a/src/frontend/src/types/zustand/flow/index.ts b/src/frontend/src/types/zustand/flow/index.ts
index 71ff76f55..bbb68d1c9 100644
--- a/src/frontend/src/types/zustand/flow/index.ts
+++ b/src/frontend/src/types/zustand/flow/index.ts
@@ -106,11 +106,12 @@ export type FlowStoreType = {
startNodeId,
stopNodeId,
input_value,
+ silent,
}: {
- nodeId?: string;
startNodeId?: string;
stopNodeId?: string;
input_value?: string;
+ silent?: boolean;
}) => Promise;
getFlow: () => { nodes: Node[]; edges: Edge[]; viewport: Viewport };
updateVerticesBuild: (
diff --git a/src/frontend/src/utils/reactflowUtils.ts b/src/frontend/src/utils/reactflowUtils.ts
index b379b445e..d1f960017 100644
--- a/src/frontend/src/utils/reactflowUtils.ts
+++ b/src/frontend/src/utils/reactflowUtils.ts
@@ -16,6 +16,8 @@ import {
specialCharsRegex,
} from "../constants/constants";
import { downloadFlowsFromDatabase } from "../controllers/API";
+import getFieldTitle from "../customNodes/utils/get-field-title";
+import { DESCRIPTIONS } from "../flow_constants";
import {
APIClassType,
APIKindType,
@@ -37,8 +39,6 @@ import {
updateEdgesHandleIdsType,
} from "../types/utils/reactflowUtils";
import { createRandomKey, toTitleCase } from "./utils";
-import { DESCRIPTIONS } from "../flow_constants";
-import getFieldTitle from "../customNodes/utils/get-field-title";
const uid = new ShortUniqueId({ length: 5 });
export function checkChatInput(nodes: Node[]) {
@@ -99,18 +99,18 @@ export function unselectAllNodes({ updateNodes, data }: unselectAllNodesType) {
export function isValidConnection(
{ source, target, sourceHandle, targetHandle }: Connection,
nodes: Node[],
- edges: Edge[],
+ edges: Edge[]
) {
const targetHandleObject: targetHandleType = scapeJSONParse(targetHandle!);
const sourceHandleObject: sourceHandleType = scapeJSONParse(sourceHandle!);
if (
targetHandleObject.inputTypes?.some(
- (n) => n === sourceHandleObject.dataType,
+ (n) => n === sourceHandleObject.dataType
) ||
sourceHandleObject.baseClasses.some(
(t) =>
targetHandleObject.inputTypes?.some((n) => n === t) ||
- t === targetHandleObject.type,
+ t === targetHandleObject.type
)
) {
let targetNode = nodes.find((node) => node.id === target!)?.data?.node;
@@ -143,7 +143,7 @@ export function removeApiKeys(flow: FlowType): FlowType {
export function updateTemplate(
reference: APITemplateType,
- objectToUpdate: APITemplateType,
+ objectToUpdate: APITemplateType
): APITemplateType {
let clonedObject: APITemplateType = cloneDeep(reference);
@@ -203,7 +203,7 @@ export const processDataFromFlow = (flow: FlowType, refreshIds = true) => {
export function updateIds(
{ edges, nodes }: { edges: Edge[]; nodes: Node[] },
- selection?: { edges: Edge[]; nodes: Node[] },
+ selection?: { edges: Edge[]; nodes: Node[] }
) {
let idsMap = {};
const selectionIds = selection?.nodes.map((n) => n.id);
@@ -231,7 +231,7 @@ export function updateIds(
edge.source = idsMap[edge.source];
edge.target = idsMap[edge.target];
const sourceHandleObject: sourceHandleType = scapeJSONParse(
- edge.sourceHandle!,
+ edge.sourceHandle!
);
edge.sourceHandle = scapedJSONStringfy({
...sourceHandleObject,
@@ -241,7 +241,7 @@ export function updateIds(
edge.data.sourceHandle.id = edge.source;
}
const targetHandleObject: targetHandleType = scapeJSONParse(
- edge.targetHandle!,
+ edge.targetHandle!
);
edge.targetHandle = scapedJSONStringfy({
...targetHandleObject,
@@ -287,11 +287,11 @@ export function validateNode(node: NodeType, edges: Edge[]): Array {
(scapeJSONParse(edge.targetHandle!) as targetHandleType).fieldName ===
t &&
(scapeJSONParse(edge.targetHandle!) as targetHandleType).id ===
- node.id,
+ node.id
)
) {
errors.push(
- `${displayName || type} is missing ${getFieldTitle(template, t)}.`,
+ `${displayName || type} is missing ${getFieldTitle(template, t)}.`
);
} else if (
template[t].type === "dict" &&
@@ -305,15 +305,15 @@ export function validateNode(node: NodeType, edges: Edge[]): Array {
errors.push(
`${displayName || type} (${getFieldTitle(
template,
- t,
- )}) contains duplicate keys with the same values.`,
+ t
+ )}) contains duplicate keys with the same values.`
);
if (hasEmptyKey(template[t].value))
errors.push(
`${displayName || type} (${getFieldTitle(
template,
- t,
- )}) field must not be empty.`,
+ t
+ )}) field must not be empty.`
);
}
return errors;
@@ -322,7 +322,7 @@ export function validateNode(node: NodeType, edges: Edge[]): Array {
export function validateNodes(
nodes: Node[],
- edges: Edge[],
+ edges: Edge[]
): // this returns an array of tuples with the node id and the errors
Array<{ id: string; errors: Array }> {
if (nodes.length === 0) {
@@ -343,19 +343,14 @@ export function updateEdges(edges: Edge[]) {
if (edges)
edges.forEach((edge) => {
const targetHandleObject: targetHandleType = scapeJSONParse(
- edge.targetHandle!,
+ edge.targetHandle!
);
edge.className = "stroke-gray-900 stroke-connection";
});
}
export function addVersionToDuplicates(flow: FlowType, flows: FlowType[]) {
- console.log("flow", flow);
- console.log("flows", flows);
- const existingNames = flows
- .filter((f) => f.folder_id === flow.folder_id)
- .map((item) => item.name);
- console.log("existingNames", existingNames);
+ const existingNames = flows.map((item) => item.name);
let newName = flow.name;
let count = 1;
@@ -415,7 +410,7 @@ export function handleKeyDown(
| React.KeyboardEvent
| React.KeyboardEvent,
inputValue: string | string[] | null,
- block: string,
+ block: string
) {
// condition to fix the Ctrl+Backspace bug on Windows/Linux
if (
@@ -440,7 +435,7 @@ export function handleKeyDown(
}
export function handleOnlyIntegerInput(
- event: React.KeyboardEvent,
+ event: React.KeyboardEvent
) {
if (
event.key === "." ||
@@ -456,7 +451,7 @@ export function handleOnlyIntegerInput(
export function getConnectedNodes(
edge: Edge,
- nodes: Array,
+ nodes: Array
): Array {
const sourceId = edge.source;
const targetId = edge.target;
@@ -557,7 +552,7 @@ export function checkOldEdgesHandles(edges: Edge[]): boolean {
!edge.sourceHandle ||
!edge.targetHandle ||
!edge.sourceHandle.includes("{") ||
- !edge.targetHandle.includes("{"),
+ !edge.targetHandle.includes("{")
);
}
@@ -580,7 +575,7 @@ export function customStringify(obj: any): string {
const keys = Object.keys(obj).sort();
const keyValuePairs = keys.map(
- (key) => `"${key}":${customStringify(obj[key])}`,
+ (key) => `"${key}":${customStringify(obj[key])}`
);
return `{${keyValuePairs.join(",")}}`;
}
@@ -609,7 +604,7 @@ export function getHandleId(
source: string,
sourceHandle: string,
target: string,
- targetHandle: string,
+ targetHandle: string
) {
return (
"reactflow__edge-" + source + sourceHandle + "-" + target + targetHandle
@@ -620,7 +615,7 @@ export function generateFlow(
selection: OnSelectionChangeParams,
nodes: Node[],
edges: Edge[],
- name: string,
+ name: string
): generateFlowType {
const newFlowData = { nodes, edges, viewport: { zoom: 1, x: 0, y: 0 } };
const uid = new ShortUniqueId({ length: 5 });
@@ -629,7 +624,7 @@ export function generateFlow(
newFlowData.edges = selection.edges.filter(
(edge) =>
selection.nodes.some((node) => node.id === edge.target) &&
- selection.nodes.some((node) => node.id === edge.source),
+ selection.nodes.some((node) => node.id === edge.source)
);
newFlowData.nodes = selection.nodes;
@@ -650,7 +645,7 @@ export function generateFlow(
(edge) =>
(selection.nodes.some((node) => node.id === edge.target) ||
selection.nodes.some((node) => node.id === edge.source)) &&
- newFlowData.edges.every((e) => e.id !== edge.id),
+ newFlowData.edges.every((e) => e.id !== edge.id)
),
};
}
@@ -661,13 +656,13 @@ export function reconnectEdges(groupNode: NodeType, excludedEdges: Edge[]) {
const { nodes, edges } = groupNode.data.node!.flow!.data!;
const lastNode = findLastNode(groupNode.data.node!.flow!.data!);
newEdges = newEdges.filter(
- (e) => !(nodes.some((n) => n.id === e.source) && e.source !== lastNode?.id),
+ (e) => !(nodes.some((n) => n.id === e.source) && e.source !== lastNode?.id)
);
newEdges.forEach((edge) => {
if (lastNode && edge.source === lastNode.id) {
edge.source = groupNode.id;
let newSourceHandle: sourceHandleType = scapeJSONParse(
- edge.sourceHandle!,
+ edge.sourceHandle!
);
newSourceHandle.id = groupNode.id;
edge.sourceHandle = scapedJSONStringfy(newSourceHandle);
@@ -694,7 +689,7 @@ export function reconnectEdges(groupNode: NodeType, excludedEdges: Edge[]) {
export function filterFlow(
selection: OnSelectionChangeParams,
setNodes: (update: Node[] | ((oldState: Node[]) => Node[])) => void,
- setEdges: (update: Edge[] | ((oldState: Edge[]) => Edge[])) => void,
+ setEdges: (update: Edge[] | ((oldState: Edge[]) => Edge[])) => void
) {
setNodes((nodes) => nodes.filter((node) => !selection.nodes.includes(node)));
setEdges((edges) => edges.filter((edge) => !selection.edges.includes(edge)));
@@ -732,7 +727,7 @@ export function updateFlowPosition(NewPosition: XYPosition, flow: FlowType) {
export function concatFlows(
flow: FlowType,
setNodes: (update: Node[] | ((oldState: Node[]) => Node[])) => void,
- setEdges: (update: Edge[] | ((oldState: Edge[]) => Edge[])) => void,
+ setEdges: (update: Edge[] | ((oldState: Edge[]) => Edge[])) => void
) {
const { nodes, edges } = flow.data!;
setNodes((old) => [...old, ...nodes]);
@@ -741,7 +736,7 @@ export function concatFlows(
export function validateSelection(
selection: OnSelectionChangeParams,
- edges: Edge[],
+ edges: Edge[]
): Array {
const clonedSelection = cloneDeep(selection);
const clonedEdges = cloneDeep(edges);
@@ -755,7 +750,7 @@ export function validateSelection(
let nodesSet = new Set(clonedSelection.nodes.map((n) => n.id));
// then filter the edges that are connected to the nodes in the set
let connectedEdges = clonedSelection.edges.filter(
- (e) => nodesSet.has(e.source) && nodesSet.has(e.target),
+ (e) => nodesSet.has(e.source) && nodesSet.has(e.target)
);
// add the edges to the selection
clonedSelection.edges = connectedEdges;
@@ -769,17 +764,17 @@ export function validateSelection(
clonedSelection.nodes.some(
(node) =>
isInputNode(node.data as NodeDataType) ||
- isOutputNode(node.data as NodeDataType),
+ isOutputNode(node.data as NodeDataType)
)
) {
errorsArray.push(
- "Please select only nodes that are not input or output nodes",
+ "Please select only nodes that are not input or output nodes"
);
}
//check if there are two or more nodes with free outputs
if (
clonedSelection.nodes.filter(
- (n) => !clonedSelection.edges.some((e) => e.source === n.id),
+ (n) => !clonedSelection.edges.some((e) => e.source === n.id)
).length > 1
) {
errorsArray.push("Please select only one node with free outputs");
@@ -790,7 +785,7 @@ export function validateSelection(
clonedSelection.nodes.some(
(node) =>
!clonedSelection.edges.some((edge) => edge.target === node.id) &&
- !clonedSelection.edges.some((edge) => edge.source === node.id),
+ !clonedSelection.edges.some((edge) => edge.source === node.id)
)
) {
errorsArray.push("Please select only nodes that are connected");
@@ -847,8 +842,8 @@ export function mergeNodeTemplates({
nodeTemplate[key].display_name
? nodeTemplate[key].display_name
: nodeTemplate[key].name
- ? toTitleCase(nodeTemplate[key].name)
- : toTitleCase(key);
+ ? toTitleCase(nodeTemplate[key].name)
+ : toTitleCase(key);
}
}
});
@@ -859,7 +854,7 @@ function isHandleConnected(
edges: Edge[],
key: string,
field: TemplateVariableType,
- nodeId: string,
+ nodeId: string
) {
/*
this function receives a flow and a handleId and checks if there is a connection with this handle
@@ -875,7 +870,7 @@ function isHandleConnected(
id: nodeId,
proxy: { id: field.proxy!.id, field: field.proxy!.field },
inputTypes: field.input_types,
- } as targetHandleType),
+ } as targetHandleType)
)
) {
return true;
@@ -890,7 +885,7 @@ function isHandleConnected(
fieldName: key,
id: nodeId,
inputTypes: field.input_types,
- } as targetHandleType),
+ } as targetHandleType)
)
) {
return true;
@@ -913,7 +908,7 @@ export function generateNodeTemplate(Flow: FlowType) {
export function generateNodeFromFlow(
flow: FlowType,
- getNodeId: (type: string) => string,
+ getNodeId: (type: string) => string
): NodeType {
const { nodes } = flow.data!;
const outputNode = cloneDeep(findLastNode(flow.data!));
@@ -944,7 +939,7 @@ export function generateNodeFromFlow(
export function connectedInputNodesOnHandle(
nodeId: string,
handleId: string,
- { nodes, edges }: { nodes: NodeType[]; edges: Edge[] },
+ { nodes, edges }: { nodes: NodeType[]; edges: Edge[] }
) {
const connectedNodes: Array<{ name: string; id: string; isGroup: boolean }> =
[];
@@ -981,7 +976,7 @@ export function connectedInputNodesOnHandle(
export function updateProxyIdsOnTemplate(
template: APITemplateType,
- idsMap: { [key: string]: string },
+ idsMap: { [key: string]: string }
) {
Object.keys(template).forEach((key) => {
if (template[key].proxy && idsMap[template[key].proxy!.id]) {
@@ -992,7 +987,7 @@ export function updateProxyIdsOnTemplate(
export function updateEdgesIds(
edges: Edge[],
- idsMap: { [key: string]: string },
+ idsMap: { [key: string]: string }
) {
edges.forEach((edge) => {
let targetHandle: targetHandleType = edge.data.targetHandle;
@@ -1024,7 +1019,7 @@ export function expandGroupNode(
nodes: Node[],
edges: Edge[],
setNodes: (update: Node[] | ((oldState: Node[]) => Node[])) => void,
- setEdges: (update: Edge[] | ((oldState: Edge[]) => Edge[])) => void,
+ setEdges: (update: Edge[] | ((oldState: Edge[]) => Edge[])) => void
) {
const idsMap = updateIds(flow!.data!);
updateProxyIdsOnTemplate(template, idsMap);
@@ -1067,7 +1062,7 @@ export function expandGroupNode(
const lastNode = cloneDeep(findLastNode(flow!.data!));
newEdge.source = lastNode!.id;
let newSourceHandle: sourceHandleType = scapeJSONParse(
- newEdge.sourceHandle!,
+ newEdge.sourceHandle!
);
newSourceHandle.id = lastNode!.id;
newEdge.data.sourceHandle = newSourceHandle;
@@ -1124,7 +1119,7 @@ export function expandGroupNode(
export function getGroupStatus(
flow: FlowType,
- ssData: { [key: string]: { valid: boolean; params: string } },
+ ssData: { [key: string]: { valid: boolean; params: string } }
) {
let status = { valid: true, params: SUCCESS_BUILD };
const { nodes } = flow.data!;
@@ -1143,7 +1138,7 @@ export function getGroupStatus(
export function createFlowComponent(
nodeData: NodeDataType,
- version: string,
+ version: string
): FlowType {
const flowNode: FlowType = {
data: {
@@ -1179,7 +1174,7 @@ export function downloadNode(NodeFLow: FlowType) {
export function updateComponentNameAndType(
data: any,
- component: NodeDataType,
+ component: NodeDataType
) {}
export function removeFileNameFromComponents(flow: FlowType) {
@@ -1253,7 +1248,7 @@ export function extractFieldsFromComponenents(data: APIObjectType) {
export function downloadFlow(
flow: FlowType,
flowName: string,
- flowDescription?: string,
+ flowDescription?: string
) {
let clonedFlow = cloneDeep(flow);
removeFileNameFromComponents(clonedFlow);
@@ -1263,7 +1258,7 @@ export function downloadFlow(
...clonedFlow,
name: flowName,
description: flowDescription,
- }),
+ })
)}`;
// create a link element and set its properties
@@ -1278,7 +1273,7 @@ export function downloadFlow(
export function downloadFlows() {
downloadFlowsFromDatabase().then((flows) => {
const jsonString = `data:text/json;chatset=utf-8,${encodeURIComponent(
- JSON.stringify(flows),
+ JSON.stringify(flows)
)}`;
// create a link element and set its properties
@@ -1302,7 +1297,7 @@ export function getRandomDescription(): string {
export const createNewFlow = (
flowData: ReactFlowJsonObject,
flow: FlowType,
- folderId: string,
+ folderId: string
) => {
return {
description: flow?.description ?? getRandomDescription(),
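The `addVersionToDuplicates` change above drops the folder scoping and the `console.log` calls, so duplicate detection now runs against every flow name. The hunk cuts off before the suffixing loop, so the ` (n)` suffix format in this standalone sketch is an assumption, not taken from the patch:

```typescript
// Hypothetical sketch of duplicate-name versioning as in
// addVersionToDuplicates after the patch: the candidate name is checked
// against ALL existing flow names (no folder scoping) and a numeric
// suffix is appended until it is unique. The " (n)" format is assumed.
function versionedName(name: string, existingNames: string[]): string {
  let newName = name;
  let count = 1;
  while (existingNames.includes(newName)) {
    newName = `${name} (${count})`;
    count += 1;
  }
  return newName;
}
```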
diff --git a/src/frontend/src/utils/utils.ts b/src/frontend/src/utils/utils.ts
index 619170124..8544afacc 100644
--- a/src/frontend/src/utils/utils.ts
+++ b/src/frontend/src/utils/utils.ts
@@ -55,7 +55,7 @@ export function normalCaseToSnakeCase(str: string): string {
export function toTitleCase(
str: string | undefined,
- isNodeField?: boolean
+ isNodeField?: boolean,
): string {
if (!str) return "";
let result = str
@@ -64,7 +64,7 @@ export function toTitleCase(
if (isNodeField) return word;
if (index === 0) {
return checkUpperWords(
- word[0].toUpperCase() + word.slice(1).toLowerCase()
+ word[0].toUpperCase() + word.slice(1).toLowerCase(),
);
}
return checkUpperWords(word.toLowerCase());
@@ -77,7 +77,7 @@ export function toTitleCase(
if (isNodeField) return word;
if (index === 0) {
return checkUpperWords(
- word[0].toUpperCase() + word.slice(1).toLowerCase()
+ word[0].toUpperCase() + word.slice(1).toLowerCase(),
);
}
return checkUpperWords(word.toLowerCase());
@@ -181,7 +181,7 @@ export function checkLocalStorageKey(key: string): boolean {
export function IncrementObjectKey(
object: object,
- key: string
+ key: string,
): { newKey: string; increment: number } {
let count = 1;
const type = removeCountFromString(key);
@@ -216,7 +216,7 @@ export function groupByFamily(
data: APIDataType,
baseClasses: string,
left: boolean,
- flow?: NodeType[]
+ flow?: NodeType[],
): groupedObjType[] {
const baseClassesSet = new Set(baseClasses.split("\n"));
let arrOfPossibleInputs: Array<{
@@ -242,7 +242,7 @@ export function groupByFamily(
baseClassesSet.has(template.type)) ||
(template.input_types &&
template.input_types.some((inputType) =>
- baseClassesSet.has(inputType)
+ baseClassesSet.has(inputType),
)))
);
};
@@ -262,7 +262,7 @@ export function groupByFamily(
hasBaseClassInBaseClasses:
foundNode?.hasBaseClassInBaseClasses ||
nodeData.node!.base_classes.some((baseClass) =>
- baseClassesSet.has(baseClass)
+ baseClassesSet.has(baseClass),
), //seta como anterior ou verifica se o node tem base class
displayName: nodeData.node?.display_name,
});
@@ -279,10 +279,10 @@ export function groupByFamily(
if (!foundNode) {
foundNode = {
hasBaseClassInTemplate: Object.values(node!.template).some(
- checkBaseClass
+ checkBaseClass,
),
hasBaseClassInBaseClasses: node!.base_classes.some((baseClass) =>
- baseClassesSet.has(baseClass)
+ baseClassesSet.has(baseClass),
),
displayName: node?.display_name,
};
@@ -351,7 +351,7 @@ export function isTimeStampString(str: string): boolean {
export function extractColumnsFromRows(
rows: object[],
- mode: "intersection" | "union"
+ mode: "intersection" | "union",
): (ColDef | ColGroupDef)[] {
const columnsKeys: { [key: string]: ColDef | ColGroupDef } = {};
if (rows.length === 0) {
diff --git a/src/frontend/tests/end-to-end/chatInputOutput.spec.ts b/src/frontend/tests/end-to-end/chatInputOutput.spec.ts
index 0828e66e6..a1429ddec 100644
--- a/src/frontend/tests/end-to-end/chatInputOutput.spec.ts
+++ b/src/frontend/tests/end-to-end/chatInputOutput.spec.ts
@@ -23,7 +23,7 @@ test("chat_io_teste", async ({ page }) => {
}
const jsonContent = readFileSync(
- "tests/end-to-end/assets/ChatTest.json",
+ "src/frontend/tests/end-to-end/assets/ChatTest.json",
"utf-8"
);
diff --git a/src/frontend/tests/end-to-end/chatInputOutputUser.spec.ts b/src/frontend/tests/end-to-end/chatInputOutputUser.spec.ts
index 7c442116e..4c6b90081 100644
--- a/src/frontend/tests/end-to-end/chatInputOutputUser.spec.ts
+++ b/src/frontend/tests/end-to-end/chatInputOutputUser.spec.ts
@@ -59,6 +59,9 @@ test("user must interact with chat with Input/Output", async ({ page }) => {
.fill(
"testtesttesttesttesttestte;.;.,;,.;,.;.,;,..,;;;;;;;;;;;;;;;;;;;;;,;.;,.;,.,;.,;.;.,~~çççççççççççççççççççççççççççççççççççççççisdajfdasiopjfaodisjhvoicxjiovjcxizopjviopasjioasfhjaiohf23432432432423423sttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttestççççççççççççççççççççççççççççççççç,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,.,!"
);
+ await page.getByTestId("icon-LucideSend").click();
+ await page.getByText("Close", { exact: true }).click();
+
await page
.getByTestId("popover-anchor-input-sender_name")
.nth(1)
@@ -68,7 +71,7 @@ test("user must interact with chat with Input/Output", async ({ page }) => {
.nth(0)
.fill("TestSenderNameAI");
- await page.getByText("Playground", { exact: true }).click();
+ await page.getByText("Playground", { exact: true }).last().click();
await page.getByTestId("icon-LucideSend").click();
valueUser = await page
diff --git a/src/frontend/tests/end-to-end/deleteComponentFlows.spec.ts b/src/frontend/tests/end-to-end/deleteComponentFlows.spec.ts
index 854ac388c..836e8fe81 100644
--- a/src/frontend/tests/end-to-end/deleteComponentFlows.spec.ts
+++ b/src/frontend/tests/end-to-end/deleteComponentFlows.spec.ts
@@ -11,6 +11,7 @@ test("shoud delete a flow", async ({ page }) => {
.fill(process.env.STORE_API_KEY ?? "");
await page.getByText("Save").last().click();
await page.waitForTimeout(8000);
+ await page.getByText("Store").nth(0).click();
await page.getByTestId("install-Website Content QA").click();
await page.waitForTimeout(5000);
diff --git a/src/frontend/tests/end-to-end/dragAndDrop.spec.ts b/src/frontend/tests/end-to-end/dragAndDrop.spec.ts
index ee1789d2d..3e760901d 100644
--- a/src/frontend/tests/end-to-end/dragAndDrop.spec.ts
+++ b/src/frontend/tests/end-to-end/dragAndDrop.spec.ts
@@ -26,8 +26,8 @@ test.describe("drag and drop test", () => {
await page.locator("span").filter({ hasText: "My Collection" }).isVisible();
// Read your file into a buffer.
const jsonContent = readFileSync(
- "tests/end-to-end/assets/collection.json",
- "utf-8",
+ "src/frontend/tests/end-to-end/assets/collection.json",
+ "utf-8"
);
// Create the DataTransfer and File
@@ -47,7 +47,7 @@ test.describe("drag and drop test", () => {
"drop",
{
dataTransfer,
- },
+ }
);
const genericNoda = page.getByTestId("div-generic-node");
diff --git a/src/frontend/tests/end-to-end/dropdownComponent.spec.ts b/src/frontend/tests/end-to-end/dropdownComponent.spec.ts
index 186226a54..70c1b4a64 100644
--- a/src/frontend/tests/end-to-end/dropdownComponent.spec.ts
+++ b/src/frontend/tests/end-to-end/dropdownComponent.spec.ts
@@ -78,32 +78,32 @@ test("dropDownComponent", async ({ page }) => {
await page.locator('//*[@id="showcredentials_profile_name"]').click();
expect(
- await page.locator('//*[@id="showcredentials_profile_name"]').isChecked(),
+ await page.locator('//*[@id="showcredentials_profile_name"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showcredentials_profile_name"]').click();
expect(
- await page.locator('//*[@id="showcredentials_profile_name"]').isChecked(),
+ await page.locator('//*[@id="showcredentials_profile_name"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showendpoint_url"]').click();
expect(
- await page.locator('//*[@id="showendpoint_url"]').isChecked(),
+ await page.locator('//*[@id="showendpoint_url"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showendpoint_url"]').click();
expect(
- await page.locator('//*[@id="showendpoint_url"]').isChecked(),
+ await page.locator('//*[@id="showendpoint_url"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showregion_name"]').click();
expect(
- await page.locator('//*[@id="showregion_name"]').isChecked(),
+ await page.locator('//*[@id="showregion_name"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showregion_name"]').click();
expect(
- await page.locator('//*[@id="showregion_name"]').isChecked(),
+ await page.locator('//*[@id="showregion_name"]').isChecked()
).toBeTruthy();
// showmodel_id
@@ -113,7 +113,7 @@ test("dropDownComponent", async ({ page }) => {
// showmodel_id
await page.locator('//*[@id="showmodel_id"]').click();
expect(
- await page.locator('//*[@id="showmodel_id"]').isChecked(),
+ await page.locator('//*[@id="showmodel_id"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showcache"]').click();
@@ -124,32 +124,32 @@ test("dropDownComponent", async ({ page }) => {
await page.locator('//*[@id="showcredentials_profile_name"]').click();
expect(
- await page.locator('//*[@id="showcredentials_profile_name"]').isChecked(),
+ await page.locator('//*[@id="showcredentials_profile_name"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showcredentials_profile_name"]').click();
expect(
- await page.locator('//*[@id="showcredentials_profile_name"]').isChecked(),
+ await page.locator('//*[@id="showcredentials_profile_name"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showendpoint_url"]').click();
expect(
- await page.locator('//*[@id="showendpoint_url"]').isChecked(),
+ await page.locator('//*[@id="showendpoint_url"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showendpoint_url"]').click();
expect(
- await page.locator('//*[@id="showendpoint_url"]').isChecked(),
+ await page.locator('//*[@id="showendpoint_url"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showregion_name"]').click();
expect(
- await page.locator('//*[@id="showregion_name"]').isChecked(),
+ await page.locator('//*[@id="showregion_name"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showregion_name"]').click();
expect(
- await page.locator('//*[@id="showregion_name"]').isChecked(),
+ await page.locator('//*[@id="showregion_name"]').isChecked()
).toBeTruthy();
// showmodel_id
@@ -159,7 +159,7 @@ test("dropDownComponent", async ({ page }) => {
// showmodel_id
await page.locator('//*[@id="showmodel_id"]').click();
expect(
- await page.locator('//*[@id="showmodel_id"]').isChecked(),
+ await page.locator('//*[@id="showmodel_id"]').isChecked()
).toBeTruthy();
await page.getByTestId("dropdown-edit-model_id").click();
@@ -170,7 +170,7 @@ test("dropDownComponent", async ({ page }) => {
expect(false).toBeTruthy();
}
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
value = await page.getByTestId("dropdown-model_id").innerText();
if (value !== "ai21.j2-mid-v1") {
diff --git a/src/frontend/tests/end-to-end/filterEdge.spec.ts b/src/frontend/tests/end-to-end/filterEdge.spec.ts
index 7ba874f43..e2f7579c5 100644
--- a/src/frontend/tests/end-to-end/filterEdge.spec.ts
+++ b/src/frontend/tests/end-to-end/filterEdge.spec.ts
@@ -40,7 +40,7 @@ test("LLMChain - Tooltip", async ({ page }) => {
await page
.locator(
- '//*[@id="react-flow-id"]/div[1]/div[1]/div/div/div[2]/div/div/div[2]/div[3]/div/button/div/div',
+ '//*[@id="react-flow-id"]/div[1]/div[1]/div/div/div[2]/div/div/div[2]/div[3]/div/button/div/div'
)
.hover()
.then(async () => {
@@ -60,17 +60,17 @@ test("LLMChain - Tooltip", async ({ page }) => {
await page.getByTitle("zoom out").click();
await page
.locator(
- '//*[@id="react-flow-id"]/div[1]/div[1]/div/div/div[2]/div/div/div[2]/div[4]/div/button/div/div',
+ '//*[@id="react-flow-id"]/div[1]/div[1]/div/div/div[2]/div/div/div[2]/div[4]/div/button/div/div'
)
.hover()
.then(async () => {
await expect(
- page.getByTestId("tooltip-Model Specs").first(),
+ page.getByTestId("tooltip-Model Specs").first()
).toBeVisible();
await page.waitForTimeout(2000);
await expect(
- page.getByTestId("tooltip-Model Specs").first(),
+ page.getByTestId("tooltip-Model Specs").first()
).toBeVisible();
await page.getByTestId("icon-Search").click();
@@ -81,12 +81,12 @@ test("LLMChain - Tooltip", async ({ page }) => {
await page
.locator(
- '//*[@id="react-flow-id"]/div[1]/div[1]/div/div/div[2]/div/div/div[2]/div[5]/div/button/div/div',
+ '//*[@id="react-flow-id"]/div[1]/div[1]/div/div/div[2]/div/div/div[2]/div[5]/div/button/div/div'
)
.hover()
.then(async () => {
await expect(
- page.getByTestId("empty-tooltip-filter").first(),
+ page.getByTestId("empty-tooltip-filter").first()
).toBeVisible();
});
});
@@ -113,7 +113,7 @@ test("LLMChain - Filter", async ({ page }) => {
await page.waitForTimeout(1000);
await page.getByTestId(
- "input-list-plus-btn-edit_metadata_indexing_include-2",
+ "input-list-plus-btn-edit_metadata_indexing_include-2"
);
await page.getByTestId("blank-flow").click();
@@ -136,7 +136,7 @@ test("LLMChain - Filter", async ({ page }) => {
await page
.locator(
- '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div/div/div[2]/div[4]/div/button/div/div',
+ '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div/div/div[2]/div[4]/div/button/div/div'
)
.click();
@@ -148,16 +148,15 @@ test("LLMChain - Filter", async ({ page }) => {
await expect(page.getByTestId("model_specsChatOllama")).toBeVisible();
await expect(page.getByTestId("model_specsChatOpenAI")).toBeVisible();
await expect(page.getByTestId("model_specsChatVertexAI")).toBeVisible();
- await expect(page.getByTestId("model_specsCohere")).toBeVisible();
await expect(
- page.getByTestId("model_specsGoogle Generative AI"),
+ page.getByTestId("model_specsGoogle Generative AI")
).toBeVisible();
await expect(
- page.getByTestId("model_specsHugging Face Inference API"),
+ page.getByTestId("model_specsHugging Face Inference API")
).toBeVisible();
await expect(page.getByTestId("model_specsOllama")).toBeVisible();
await expect(
- page.getByTestId("model_specsQianfanChatEndpoint"),
+ page.getByTestId("model_specsQianfanChatEndpoint")
).toBeVisible();
await expect(page.getByTestId("model_specsQianfanLLMEndpoint")).toBeVisible();
await expect(page.getByTestId("model_specsVertexAI")).toBeVisible();
@@ -169,24 +168,23 @@ test("LLMChain - Filter", async ({ page }) => {
await expect(page.getByTestId("model_specsAmazon Bedrock")).not.toBeVisible();
await expect(page.getByTestId("modelsAzure OpenAI")).not.toBeVisible();
await expect(
- page.getByTestId("model_specsAzureChatOpenAI"),
+ page.getByTestId("model_specsAzureChatOpenAI")
).not.toBeVisible();
await expect(page.getByTestId("model_specsChatAnthropic")).not.toBeVisible();
await expect(page.getByTestId("model_specsChatLiteLLM")).not.toBeVisible();
await expect(page.getByTestId("model_specsChatOllama")).not.toBeVisible();
await expect(page.getByTestId("model_specsChatOpenAI")).not.toBeVisible();
await expect(page.getByTestId("model_specsChatVertexAI")).not.toBeVisible();
- await expect(page.getByTestId("model_specsCohere")).not.toBeVisible();
await page
.locator(
- '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div/div/div[2]/div[7]/button/div/div',
+ '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div/div/div[2]/div[7]/button/div/div'
)
.click();
await page
.locator(
- '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div/div/div[2]/div[7]/button/div/div',
+ '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div/div/div[2]/div[7]/button/div/div'
)
.click();
diff --git a/src/frontend/tests/end-to-end/floatComponent.spec.ts b/src/frontend/tests/end-to-end/floatComponent.spec.ts
index 70f7bae7e..4989a0f5f 100644
--- a/src/frontend/tests/end-to-end/floatComponent.spec.ts
+++ b/src/frontend/tests/end-to-end/floatComponent.spec.ts
@@ -60,44 +60,38 @@ test("FloatComponent", async ({ page }) => {
await page.getByTestId("more-options-modal").click();
await page.getByTestId("edit-button-modal").click();
- await page.locator('//*[@id="showcache"]').click();
- expect(await page.locator('//*[@id="showcache"]').isChecked()).toBeTruthy();
-
- await page.locator('//*[@id="showcache"]').click();
- expect(await page.locator('//*[@id="showcache"]').isChecked()).toBeFalsy();
-
await page.getByTestId("showformat").click();
expect(await page.locator('//*[@id="showformat"]').isChecked()).toBeTruthy();
await page.getByTestId("showformat").click();
expect(await page.locator('//*[@id="showformat"]').isChecked()).toBeFalsy();
- await page.getByTestId("showmirostat").click();
- expect(
- await page.locator('//*[@id="showmirostat"]').isChecked(),
- ).toBeTruthy();
-
await page.getByTestId("showmirostat").click();
expect(await page.locator('//*[@id="showmirostat"]').isChecked()).toBeFalsy();
- await page.getByTestId("showmirostat_eta").click();
+ await page.getByTestId("showmirostat").click();
expect(
- await page.locator('//*[@id="showmirostat_eta"]').isChecked(),
+ await page.locator('//*[@id="showmirostat"]').isChecked()
).toBeTruthy();
await page.getByTestId("showmirostat_eta").click();
expect(
- await page.locator('//*[@id="showmirostat_eta"]').isChecked(),
+ await page.locator('//*[@id="showmirostat_eta"]').isChecked()
+ ).toBeTruthy();
+
+ await page.getByTestId("showmirostat_eta").click();
+ expect(
+ await page.locator('//*[@id="showmirostat_eta"]').isChecked()
).toBeFalsy();
await page.getByTestId("showmirostat_tau").click();
expect(
- await page.locator('//*[@id="showmirostat_tau"]').isChecked(),
+ await page.locator('//*[@id="showmirostat_tau"]').isChecked()
).toBeTruthy();
await page.getByTestId("showmirostat_tau").click();
expect(
- await page.locator('//*[@id="showmirostat_tau"]').isChecked(),
+ await page.locator('//*[@id="showmirostat_tau"]').isChecked()
).toBeFalsy();
await page.getByTestId("showmodel").click();
@@ -120,25 +114,25 @@ test("FloatComponent", async ({ page }) => {
await page.getByTestId("shownum_thread").click();
expect(
- await page.locator('//*[@id="shownum_thread"]').isChecked(),
+ await page.locator('//*[@id="shownum_thread"]').isChecked()
).toBeTruthy();
await page.getByTestId("shownum_thread").click();
expect(
- await page.locator('//*[@id="shownum_thread"]').isChecked(),
+ await page.locator('//*[@id="shownum_thread"]').isChecked()
).toBeFalsy();
await page.getByTestId("showrepeat_last_n").click();
expect(
- await page.locator('//*[@id="showrepeat_last_n"]').isChecked(),
+ await page.locator('//*[@id="showrepeat_last_n"]').isChecked()
).toBeTruthy();
await page.getByTestId("showrepeat_last_n").click();
expect(
- await page.locator('//*[@id="showrepeat_last_n"]').isChecked(),
+ await page.locator('//*[@id="showrepeat_last_n"]').isChecked()
).toBeFalsy();
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
const plusButtonLocator = page.locator('//*[@id="float-input"]');
const elementCount = await plusButtonLocator?.count();
@@ -151,10 +145,10 @@ test("FloatComponent", async ({ page }) => {
// showtemperature
await page.locator('//*[@id="showtemperature"]').click();
expect(
- await page.locator('//*[@id="showtemperature"]').isChecked(),
+ await page.locator('//*[@id="showtemperature"]').isChecked()
).toBeTruthy();
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
await page.locator('//*[@id="float-input"]').click();
await page.locator('//*[@id="float-input"]').fill("3");
diff --git a/src/frontend/tests/end-to-end/flowSettings.spec.ts b/src/frontend/tests/end-to-end/flowSettings.spec.ts
index 607c64677..a2e9f25c0 100644
--- a/src/frontend/tests/end-to-end/flowSettings.spec.ts
+++ b/src/frontend/tests/end-to-end/flowSettings.spec.ts
@@ -29,7 +29,7 @@ test("flowSettings", async ({ page }) => {
await page
.getByPlaceholder("Flow name")
.fill(
- "Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test",
+ "Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test"
);
await page.getByText("Character limit reached").isVisible();
@@ -41,10 +41,12 @@ test("flowSettings", async ({ page }) => {
await page
.getByPlaceholder("Flow description")
.fill(
- "Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test",
+ "Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test Flow Name Test"
);
- await page.getByText("Save").last().click();
+ await page.getByTestId("save-flow-settings").click();
+
+ await page.getByText("Close").last().click();
await page.waitForTimeout(1000);
diff --git a/src/frontend/tests/end-to-end/folders.spec.ts b/src/frontend/tests/end-to-end/folders.spec.ts
index 3b82a5f75..384816b3f 100644
--- a/src/frontend/tests/end-to-end/folders.spec.ts
+++ b/src/frontend/tests/end-to-end/folders.spec.ts
@@ -31,33 +31,16 @@ test("CRUD folders", async ({ page }) => {
await page.getByText("All").first().isVisible();
await page.getByText("Select All").isVisible();
- await page.getByText("New Folder", { exact: true }).last().click();
- await page.getByPlaceholder("Insert a name for the folder").fill("test");
- await page
- .getByPlaceholder("Insert a description for the folder")
- .fill("test");
- await page.getByText("Save Folder").click();
-
+ await page.getByTestId("add-folder-button").click();
+ await page.getByText("New Folder").last().isVisible();
await page.waitForTimeout(1000);
- await page.getByText("Folder created succefully").isVisible();
- await page.getByText("test").last().isVisible();
- await page
- .getByText("test")
- .last()
- .hover()
- .then(async () => {
- await page.getByTestId("icon-pencil").last().click();
- });
-
- await page.getByPlaceholder("Insert a name for the folder").fill("test edit");
- await page
- .getByPlaceholder("Insert a description for the folder")
- .fill("test edit");
- await page.getByText("Edit Folder").last().click();
- await page.getByText("test edit").last().isVisible();
+ await page.getByText("New Folder").last().dblclick();
+ await page.getByTestId("input-folder").fill("new folder test name");
+ await page.keyboard.press("Enter");
+ await page.getByText("new folder test name").last().isVisible();
await page
- .getByText("test edit")
+ .getByText("new folder test name")
.last()
.hover()
.then(async () => {
@@ -74,8 +57,8 @@ test("add folder by drag and drop", async ({ page }) => {
await page.waitForTimeout(2000);
const jsonContent = readFileSync(
- "tests/end-to-end/assets/collection.json",
- "utf-8",
+ "src/frontend/tests/end-to-end/assets/collection.json",
+ "utf-8"
);
// Create the DataTransfer and File
@@ -95,7 +78,7 @@ test("add folder by drag and drop", async ({ page }) => {
"drop",
{
dataTransfer,
- },
+ }
);
await page.getByText("Getting Started").first().isVisible();
@@ -131,17 +114,13 @@ test("change flow folder", async ({ page }) => {
await page.getByText("All").first().isVisible();
await page.getByText("Select All").isVisible();
- await page.getByText("New Folder", { exact: true }).last().click();
- await page.getByPlaceholder("Insert a name for the folder").fill("test");
- await page
- .getByPlaceholder("Insert a description for the folder")
- .fill("test");
-
- await page.getByText("Save Folder").click();
-
+ await page.getByTestId("add-folder-button").click();
+ await page.getByText("New Folder").last().isVisible();
await page.waitForTimeout(1000);
- await page.getByText("Folder created succefully").isVisible();
- await page.getByText("test").last().isVisible();
+ await page.getByText("New Folder").last().dblclick();
+ await page.getByTestId("input-folder").fill("new folder test name");
+ await page.keyboard.press("Enter");
+ await page.getByText("new folder test name").last().isVisible();
await page.getByText("My Projects").last().click();
await page.getByText("Basic Prompting").first().hover();
diff --git a/src/frontend/tests/end-to-end/globalVariables.spec.ts b/src/frontend/tests/end-to-end/globalVariables.spec.ts
index 920ba743b..546c332f6 100644
--- a/src/frontend/tests/end-to-end/globalVariables.spec.ts
+++ b/src/frontend/tests/end-to-end/globalVariables.spec.ts
@@ -69,7 +69,6 @@ test("GlobalVariables", async ({ page }) => {
await page.getByText("Save Variable", { exact: true }).click();
expect(page.getByText(credentialName, { exact: true })).not.toBeNull();
await page.getByText(credentialName, { exact: true }).isVisible();
- await page.getByText("Save Variable", { exact: true }).click();
await page.waitForTimeout(2000);
await page
diff --git a/src/frontend/tests/end-to-end/inputComponent.spec.ts b/src/frontend/tests/end-to-end/inputComponent.spec.ts
index 2fcbc9831..5a84c9f67 100644
--- a/src/frontend/tests/end-to-end/inputComponent.spec.ts
+++ b/src/frontend/tests/end-to-end/inputComponent.spec.ts
@@ -60,69 +60,69 @@ test("InputComponent", async ({ page }) => {
expect(
await page
.locator('//*[@id="showchroma_server_cors_allow_origins"]')
- .isChecked(),
+ .isChecked()
).toBeTruthy();
await page.locator('//*[@id="showchroma_server_grpc_port"]').click();
expect(
- await page.locator('//*[@id="showchroma_server_grpc_port"]').isChecked(),
+ await page.locator('//*[@id="showchroma_server_grpc_port"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showchroma_server_host"]').click();
expect(
- await page.locator('//*[@id="showchroma_server_host"]').isChecked(),
+ await page.locator('//*[@id="showchroma_server_host"]').isChecked()
).toBeTruthy();
- await page.locator('//*[@id="showchroma_server_port"]').click();
+ await page.locator('//*[@id="showchroma_server_http_port"]').click();
expect(
- await page.locator('//*[@id="showchroma_server_port"]').isChecked(),
+ await page.locator('//*[@id="showchroma_server_http_port"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showchroma_server_ssl_enabled"]').click();
expect(
- await page.locator('//*[@id="showchroma_server_ssl_enabled"]').isChecked(),
+ await page.locator('//*[@id="showchroma_server_ssl_enabled"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showcollection_name"]').click();
expect(
- await page.locator('//*[@id="showcollection_name"]').isChecked(),
+ await page.locator('//*[@id="showcollection_name"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showindex_directory"]').click();
expect(
- await page.locator('//*[@id="showindex_directory"]').isChecked(),
+ await page.locator('//*[@id="showindex_directory"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showchroma_server_cors_allow_origins"]').click();
expect(
await page
.locator('//*[@id="showchroma_server_cors_allow_origins"]')
- .isChecked(),
+ .isChecked()
).toBeFalsy();
await page.locator('//*[@id="showchroma_server_grpc_port"]').click();
expect(
- await page.locator('//*[@id="showchroma_server_grpc_port"]').isChecked(),
+ await page.locator('//*[@id="showchroma_server_grpc_port"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showchroma_server_host"]').click();
expect(
- await page.locator('//*[@id="showchroma_server_host"]').isChecked(),
+ await page.locator('//*[@id="showchroma_server_host"]').isChecked()
).toBeFalsy();
- await page.locator('//*[@id="showchroma_server_port"]').click();
+ await page.locator('//*[@id="showchroma_server_http_port"]').click();
expect(
- await page.locator('//*[@id="showchroma_server_port"]').isChecked(),
+ await page.locator('//*[@id="showchroma_server_http_port"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showchroma_server_ssl_enabled"]').click();
expect(
- await page.locator('//*[@id="showchroma_server_ssl_enabled"]').isChecked(),
+ await page.locator('//*[@id="showchroma_server_ssl_enabled"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showindex_directory"]').click();
expect(
- await page.locator('//*[@id="showindex_directory"]').isChecked(),
+ await page.locator('//*[@id="showindex_directory"]').isChecked()
).toBeTruthy();
let valueEditNode = await page
@@ -138,7 +138,7 @@ test("InputComponent", async ({ page }) => {
.getByTestId("popover-anchor-input-collection_name-edit")
.fill("NEW_collection_name_test_123123123!@#$&*(&%$@ÇÇÇÀõe");
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
const plusButtonLocator = page.getByTestId("input-collection_name");
const elementCount = await plusButtonLocator?.count();
@@ -152,10 +152,10 @@ test("InputComponent", async ({ page }) => {
await page.locator('//*[@id="showcollection_name"]').click();
expect(
- await page.locator('//*[@id="showcollection_name"]').isChecked(),
+ await page.locator('//*[@id="showcollection_name"]').isChecked()
).toBeTruthy();
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
let value = await page
.getByTestId("popover-anchor-input-collection_name")
diff --git a/src/frontend/tests/end-to-end/inputListComponent.spec.ts b/src/frontend/tests/end-to-end/inputListComponent.spec.ts
index a434a4529..5b6dcdea8 100644
--- a/src/frontend/tests/end-to-end/inputListComponent.spec.ts
+++ b/src/frontend/tests/end-to-end/inputListComponent.spec.ts
@@ -68,7 +68,7 @@ test("InputListComponent", async ({ page }) => {
.getByTestId("input-list-input-edit_metadata_indexing_include-1")
.fill("test1 test1 test1 test1");
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
await page
.getByTestId("input-list-input_metadata_indexing_include-0")
diff --git a/src/frontend/tests/end-to-end/intComponent.spec.ts b/src/frontend/tests/end-to-end/intComponent.spec.ts
index e2a539f01..a9af11987 100644
--- a/src/frontend/tests/end-to-end/intComponent.spec.ts
+++ b/src/frontend/tests/end-to-end/intComponent.spec.ts
@@ -39,6 +39,12 @@ test("IntComponent", async ({ page }) => {
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
+
+ await page.getByTestId("more-options-modal").click();
+ await page.getByTestId("edit-button-modal").click();
+ await page.getByTestId("showmax_tokens").click();
+
+ await page.getByText("Save Changes", { exact: true }).click();
await page.getByTestId("int-input-max_tokens").click();
await page
.getByTestId("int-input-max_tokens")
@@ -81,80 +87,80 @@ test("IntComponent", async ({ page }) => {
await page.locator('//*[@id="showmodel_kwargs"]').click();
expect(
- await page.locator('//*[@id="showmodel_kwargs"]').isChecked(),
+ await page.locator('//*[@id="showmodel_kwargs"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showmodel_name"]').click();
expect(
- await page.locator('//*[@id="showmodel_name"]').isChecked(),
+ await page.locator('//*[@id="showmodel_name"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showopenai_api_base"]').click();
expect(
- await page.locator('//*[@id="showopenai_api_base"]').isChecked(),
+ await page.locator('//*[@id="showopenai_api_base"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showopenai_api_key"]').click();
expect(
- await page.locator('//*[@id="showopenai_api_key"]').isChecked(),
+ await page.locator('//*[@id="showopenai_api_key"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showtemperature"]').click();
expect(
- await page.locator('//*[@id="showtemperature"]').isChecked(),
+ await page.locator('//*[@id="showtemperature"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showmodel_kwargs"]').click();
expect(
- await page.locator('//*[@id="showmodel_kwargs"]').isChecked(),
+ await page.locator('//*[@id="showmodel_kwargs"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showmodel_name"]').click();
expect(
- await page.locator('//*[@id="showmodel_name"]').isChecked(),
+ await page.locator('//*[@id="showmodel_name"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showopenai_api_base"]').click();
expect(
- await page.locator('//*[@id="showopenai_api_base"]').isChecked(),
+ await page.locator('//*[@id="showopenai_api_base"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showopenai_api_key"]').click();
expect(
- await page.locator('//*[@id="showopenai_api_key"]').isChecked(),
+ await page.locator('//*[@id="showopenai_api_key"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showtemperature"]').click();
expect(
- await page.locator('//*[@id="showtemperature"]').isChecked(),
+ await page.locator('//*[@id="showtemperature"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showmodel_kwargs"]').click();
expect(
- await page.locator('//*[@id="showmodel_kwargs"]').isChecked(),
+ await page.locator('//*[@id="showmodel_kwargs"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showmodel_name"]').click();
expect(
- await page.locator('//*[@id="showmodel_name"]').isChecked(),
+ await page.locator('//*[@id="showmodel_name"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showopenai_api_base"]').click();
expect(
- await page.locator('//*[@id="showopenai_api_base"]').isChecked(),
+ await page.locator('//*[@id="showopenai_api_base"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showopenai_api_key"]').click();
expect(
- await page.locator('//*[@id="showopenai_api_key"]').isChecked(),
+ await page.locator('//*[@id="showopenai_api_key"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showtemperature"]').click();
expect(
- await page.locator('//*[@id="showtemperature"]').isChecked(),
+ await page.locator('//*[@id="showtemperature"]').isChecked()
).toBeFalsy();
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
const plusButtonLocator = page.getByTestId("int-input-max_tokens");
const elementCount = await plusButtonLocator?.count();
@@ -166,7 +172,7 @@ test("IntComponent", async ({ page }) => {
await page.locator('//*[@id="showtimeout"]').click();
expect(
- await page.locator('//*[@id="showtimeout"]').isChecked(),
+ await page.locator('//*[@id="showtimeout"]').isChecked()
).toBeTruthy();
const valueEditNode = await page
@@ -177,7 +183,7 @@ test("IntComponent", async ({ page }) => {
expect(false).toBeTruthy();
}
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
await page.getByTestId("int-input-max_tokens").click();
await page.getByTestId("int-input-max_tokens").fill("3");
diff --git a/src/frontend/tests/end-to-end/keyPairListComponent.spec.ts b/src/frontend/tests/end-to-end/keyPairListComponent.spec.ts
index 42e6ccd4f..e5763bd2f 100644
--- a/src/frontend/tests/end-to-end/keyPairListComponent.spec.ts
+++ b/src/frontend/tests/end-to-end/keyPairListComponent.spec.ts
@@ -81,9 +81,9 @@ test("KeypairListComponent", async ({ page }) => {
expect(await page.locator('//*[@id="showcache"]').isChecked()).toBeFalsy();
await page.locator('//*[@id="showcredentials_profile_name"]').click();
expect(
- await page.locator('//*[@id="showcredentials_profile_name"]').isChecked(),
+ await page.locator('//*[@id="showcredentials_profile_name"]').isChecked()
).toBeFalsy();
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
const plusButtonLocator = page.locator('//*[@id="plusbtn0"]');
const elementCount = await plusButtonLocator?.count();
@@ -96,7 +96,7 @@ test("KeypairListComponent", async ({ page }) => {
await page.locator('//*[@id="showcredentials_profile_name"]').click();
expect(
- await page.locator('//*[@id="showcredentials_profile_name"]').isChecked(),
+ await page.locator('//*[@id="showcredentials_profile_name"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showcache"]').click();
expect(await page.locator('//*[@id="showcache"]').isChecked()).toBeTruthy();
@@ -108,7 +108,7 @@ test("KeypairListComponent", async ({ page }) => {
const elementKeyCount = await keyPairVerification?.count();
if (elementKeyCount === 1) {
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
await page.getByTestId("div-generic-node").click();
diff --git a/src/frontend/tests/end-to-end/langflowShortcuts.spec.ts b/src/frontend/tests/end-to-end/langflowShortcuts.spec.ts
index 9d9688410..77d35cb03 100644
--- a/src/frontend/tests/end-to-end/langflowShortcuts.spec.ts
+++ b/src/frontend/tests/end-to-end/langflowShortcuts.spec.ts
@@ -47,11 +47,11 @@ test("LangflowShortcuts", async ({ page }) => {
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
- await page.getByTestId("title-Ollama").click();
+ await page.getByTestId("generic-node-title-arrangement").click();
await page.keyboard.press(`${control}+Shift+A`);
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
- await page.getByTestId("title-Ollama").click();
+ await page.getByTestId("generic-node-title-arrangement").click();
await page.keyboard.press(`${control}+d`);
let numberOfNodes = await page.getByTestId("title-Ollama")?.count();
@@ -61,7 +61,7 @@ test("LangflowShortcuts", async ({ page }) => {
await page
.locator(
- '//*[@id="react-flow-id"]/div[1]/div[1]/div[1]/div/div[2]/div[2]/div/div[1]/div/div[1]/div/div/div[1]',
+ '//*[@id="react-flow-id"]/div[1]/div[1]/div[1]/div/div[2]/div[2]/div/div[1]/div/div[1]/div/div/div[1]'
)
.click();
await page.keyboard.press("Backspace");
@@ -71,7 +71,7 @@ test("LangflowShortcuts", async ({ page }) => {
expect(false).toBeTruthy();
}
- await page.getByTestId("title-Ollama").click();
+ await page.getByTestId("generic-node-title-arrangement").click();
await page.keyboard.press(`${control}+c`);
await page.getByTestId("title-Ollama").click();
@@ -84,7 +84,7 @@ test("LangflowShortcuts", async ({ page }) => {
await page
.locator(
- '//*[@id="react-flow-id"]/div[1]/div[1]/div[1]/div/div[2]/div[2]/div/div[1]/div/div[1]/div/div/div[1]',
+ '//*[@id="react-flow-id"]/div[1]/div[1]/div[1]/div/div[2]/div[2]/div/div[1]/div/div[1]/div/div/div[1]'
)
.click();
await page.keyboard.press("Backspace");
diff --git a/src/frontend/tests/end-to-end/logs.spec.ts b/src/frontend/tests/end-to-end/logs.spec.ts
index e430529ac..ba340d436 100644
--- a/src/frontend/tests/end-to-end/logs.spec.ts
+++ b/src/frontend/tests/end-to-end/logs.spec.ts
@@ -30,6 +30,8 @@ test("should able to see and interact with logs", async ({ page }) => {
await page.getByText("No Data Available", { exact: true }).isVisible();
await page.keyboard.press("Escape");
+ await page.getByText("Close").last().click();
+
await page
.getByTestId("popover-anchor-input-openai_api_key")
.fill(process.env.OPENAI_API_KEY ?? "");
diff --git a/src/frontend/tests/end-to-end/nestedComponent.spec.ts b/src/frontend/tests/end-to-end/nestedComponent.spec.ts
index 60e1a9fb9..a6906132f 100644
--- a/src/frontend/tests/end-to-end/nestedComponent.spec.ts
+++ b/src/frontend/tests/end-to-end/nestedComponent.spec.ts
@@ -41,7 +41,7 @@ test("NestedComponent", async ({ page }) => {
await page.locator('//*[@id="showpool_threads"]').click();
expect(
- await page.locator('//*[@id="showpool_threads"]').isChecked(),
+ await page.locator('//*[@id="showpool_threads"]').isChecked()
).toBeTruthy();
//showtext_key
@@ -53,141 +53,141 @@ test("NestedComponent", async ({ page }) => {
await page.locator('//*[@id="showindex_name"]').click();
expect(
- await page.locator('//*[@id="showindex_name"]').isChecked(),
+ await page.locator('//*[@id="showindex_name"]').isChecked()
).toBeFalsy();
// shownamespace
await page.locator('//*[@id="shownamespace"]').click();
expect(
- await page.locator('//*[@id="shownamespace"]').isChecked(),
+ await page.locator('//*[@id="shownamespace"]').isChecked()
).toBeFalsy();
// showpinecone_api_key
await page.locator('//*[@id="showpinecone_api_key"]').click();
expect(
- await page.locator('//*[@id="showpinecone_api_key"]').isChecked(),
+ await page.locator('//*[@id="showpinecone_api_key"]').isChecked()
).toBeFalsy();
// showindex_name
await page.locator('//*[@id="showindex_name"]').click();
expect(
- await page.locator('//*[@id="showindex_name"]').isChecked(),
+ await page.locator('//*[@id="showindex_name"]').isChecked()
).toBeTruthy();
// shownamespace
await page.locator('//*[@id="shownamespace"]').click();
expect(
- await page.locator('//*[@id="shownamespace"]').isChecked(),
+ await page.locator('//*[@id="shownamespace"]').isChecked()
).toBeTruthy();
// showpinecone_api_key
await page.locator('//*[@id="showpinecone_api_key"]').click();
expect(
- await page.locator('//*[@id="showpinecone_api_key"]').isChecked(),
+ await page.locator('//*[@id="showpinecone_api_key"]').isChecked()
).toBeTruthy();
// showindex_name
await page.locator('//*[@id="showindex_name"]').click();
expect(
- await page.locator('//*[@id="showindex_name"]').isChecked(),
+ await page.locator('//*[@id="showindex_name"]').isChecked()
).toBeFalsy();
// shownamespace
await page.locator('//*[@id="shownamespace"]').click();
expect(
- await page.locator('//*[@id="shownamespace"]').isChecked(),
+ await page.locator('//*[@id="shownamespace"]').isChecked()
).toBeFalsy();
// showpinecone_api_key
await page.locator('//*[@id="showpinecone_api_key"]').click();
expect(
- await page.locator('//*[@id="showpinecone_api_key"]').isChecked(),
+ await page.locator('//*[@id="showpinecone_api_key"]').isChecked()
).toBeFalsy();
// showindex_name
await page.locator('//*[@id="showindex_name"]').click();
expect(
- await page.locator('//*[@id="showindex_name"]').isChecked(),
+ await page.locator('//*[@id="showindex_name"]').isChecked()
).toBeTruthy();
// shownamespace
await page.locator('//*[@id="shownamespace"]').click();
expect(
- await page.locator('//*[@id="shownamespace"]').isChecked(),
+ await page.locator('//*[@id="shownamespace"]').isChecked()
).toBeTruthy();
// showpinecone_api_key
await page.locator('//*[@id="showpinecone_api_key"]').click();
expect(
- await page.locator('//*[@id="showpinecone_api_key"]').isChecked(),
+ await page.locator('//*[@id="showpinecone_api_key"]').isChecked()
).toBeTruthy();
// showindex_name
await page.locator('//*[@id="showindex_name"]').click();
expect(
- await page.locator('//*[@id="showindex_name"]').isChecked(),
+ await page.locator('//*[@id="showindex_name"]').isChecked()
).toBeFalsy();
// shownamespace
await page.locator('//*[@id="shownamespace"]').click();
expect(
- await page.locator('//*[@id="shownamespace"]').isChecked(),
+ await page.locator('//*[@id="shownamespace"]').isChecked()
).toBeFalsy();
// showpinecone_api_key
await page.locator('//*[@id="showpinecone_api_key"]').click();
expect(
- await page.locator('//*[@id="showpinecone_api_key"]').isChecked(),
+ await page.locator('//*[@id="showpinecone_api_key"]').isChecked()
).toBeFalsy();
// showindex_name
await page.locator('//*[@id="showindex_name"]').click();
expect(
- await page.locator('//*[@id="showindex_name"]').isChecked(),
+ await page.locator('//*[@id="showindex_name"]').isChecked()
).toBeTruthy();
// shownamespace
await page.locator('//*[@id="shownamespace"]').click();
expect(
- await page.locator('//*[@id="shownamespace"]').isChecked(),
+ await page.locator('//*[@id="shownamespace"]').isChecked()
).toBeTruthy();
// showpinecone_api_key
await page.locator('//*[@id="showpinecone_api_key"]').click();
expect(
- await page.locator('//*[@id="showpinecone_api_key"]').isChecked(),
+ await page.locator('//*[@id="showpinecone_api_key"]').isChecked()
).toBeTruthy();
//showpool_threads
await page.locator('//*[@id="showpool_threads"]').click();
expect(
- await page.locator('//*[@id="showpool_threads"]').isChecked(),
+ await page.locator('//*[@id="showpool_threads"]').isChecked()
).toBeFalsy();
//showtext_key
await page.locator('//*[@id="showtext_key"]').click();
expect(
- await page.locator('//*[@id="showtext_key"]').isChecked(),
+ await page.locator('//*[@id="showtext_key"]').isChecked()
).toBeTruthy();
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
});
diff --git a/src/frontend/tests/end-to-end/promptModalComponent.spec.ts b/src/frontend/tests/end-to-end/promptModalComponent.spec.ts
index 631d69549..b7c86ee97 100644
--- a/src/frontend/tests/end-to-end/promptModalComponent.spec.ts
+++ b/src/frontend/tests/end-to-end/promptModalComponent.spec.ts
@@ -138,7 +138,7 @@ test("PromptTemplateComponent", async ({ page }) => {
await page.locator('//*[@id="showtemplate"]').click();
expect(
- await page.locator('//*[@id="showtemplate"]').isChecked(),
+ await page.locator('//*[@id="showtemplate"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showprompt"]').click();
@@ -158,13 +158,13 @@ test("PromptTemplateComponent", async ({ page }) => {
await page.locator('//*[@id="showtemplate"]').click();
expect(
- await page.locator('//*[@id="showtemplate"]').isChecked(),
+ await page.locator('//*[@id="showtemplate"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showprompt"]').click();
expect(await page.locator('//*[@id="showprompt"]').isChecked()).toBeTruthy();
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
await page.getByTestId("more-options-modal").click();
await page.getByTestId("edit-button-modal").click();
diff --git a/src/frontend/tests/end-to-end/saveComponents.spec.ts b/src/frontend/tests/end-to-end/saveComponents.spec.ts
index 49ec8fd0b..163cb8dcd 100644
--- a/src/frontend/tests/end-to-end/saveComponents.spec.ts
+++ b/src/frontend/tests/end-to-end/saveComponents.spec.ts
@@ -26,8 +26,8 @@ test.describe("save component tests", () => {
// Read your file into a buffer.
const jsonContent = readFileSync(
- "tests/end-to-end/assets/flow_group_test.json",
- "utf-8",
+ "src/frontend/tests/end-to-end/assets/flow_group_test.json",
+ "utf-8"
);
// Create the DataTransfer and File
@@ -49,7 +49,7 @@ test.describe("save component tests", () => {
"drop",
{
dataTransfer,
- },
+ }
);
const genericNoda = page.getByTestId("div-generic-node");
diff --git a/src/frontend/tests/end-to-end/store.spec.ts b/src/frontend/tests/end-to-end/store.spec.ts
index ec526c0de..13963d698 100644
--- a/src/frontend/tests/end-to-end/store.spec.ts
+++ b/src/frontend/tests/end-to-end/store.spec.ts
@@ -96,27 +96,29 @@ test("should filter by type", async ({ page }) => {
await page.getByText("Website Content QA").isVisible();
await page.getByTestId("flows-button-store").click();
- await page.waitForTimeout(3000);
+ await page.waitForTimeout(8000);
let iconGroup = await page.getByTestId("icon-Group")?.count();
expect(iconGroup).not.toBe(0);
- await page.getByText("icon-ToyBrick").isHidden();
+ await page.getByText("icon-ToyBrick").last().isHidden();
await page.getByTestId("components-button-store").click();
- await page.waitForTimeout(3000);
+ await page.waitForTimeout(8000);
- await page.getByTestId("icon-Group").isHidden();
+ await page.getByTestId("icon-Group").last().isHidden();
let toyBrick = await page.getByTestId("icon-ToyBrick")?.count();
expect(toyBrick).not.toBe(0);
await page.getByTestId("all-button-store").click();
- await page.waitForTimeout(3000);
+ await page.waitForTimeout(8000);
- iconGroup = await page.getByTestId("icon-Group")?.count();
- toyBrick = await page.getByTestId("icon-ToyBrick")?.count();
+ let iconGroupAllCount = await page.getByTestId("icon-Group")?.count();
+ await page.waitForTimeout(2000);
+ let toyBrickAllCount = await page.getByTestId("icon-ToyBrick")?.count();
+ await page.waitForTimeout(2000);
- if (iconGroup === 0 || toyBrick === 0) {
+ if (iconGroupAllCount === 0 || toyBrickAllCount === 0) {
expect(false).toBe(true);
}
});
@@ -141,7 +143,6 @@ test("should add API-KEY", async ({ page }) => {
await page.waitForTimeout(2000);
await page.getByText("API Key Error").isVisible();
- await page.getByTestId("api-key-button-store").click();
await page
.getByPlaceholder("Insert your API Key")
.fill(process.env.STORE_API_KEY ?? "");
@@ -174,6 +175,9 @@ test("should like and add components and flows", async ({ page }) => {
await page.waitForTimeout(2000);
await page.getByText("API Key Error").isHidden();
+ await page.waitForTimeout(2000);
+
+ await page.getByTestId("button-store").click();
await page.waitForTimeout(5000);
const likedValue = await page
@@ -250,6 +254,7 @@ test("should share component with share button", async ({ page }) => {
.getByPlaceholder("Flow description")
.inputValue();
await page.getByText("Save").last().click();
+ await page.getByText("Close").last().click();
await page.getByTestId("icon-Share3").first().click();
await page.getByText("Name:").isVisible();
@@ -257,7 +262,7 @@ test("should share component with share button", async ({ page }) => {
await page.getByText("Set workflow status to public").isVisible();
await page
.getByText(
- "Attention: API keys in specified fields are automatically removed upon sharing.",
+ "Attention: API keys in specified fields are automatically removed upon sharing."
)
.isVisible();
await page.getByText("Export").first().isVisible();
diff --git a/src/frontend/tests/end-to-end/textInputOutput.spec.ts b/src/frontend/tests/end-to-end/textInputOutput.spec.ts
index b5aee8fba..88e6b9f08 100644
--- a/src/frontend/tests/end-to-end/textInputOutput.spec.ts
+++ b/src/frontend/tests/end-to-end/textInputOutput.spec.ts
@@ -60,7 +60,7 @@ test("TextInputOutputComponent", async ({ page }) => {
// Click and hold on the first element
await page
.locator(
- '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div[1]/div/div[2]/div[6]/button/div/div',
+ '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div[1]/div/div[2]/div[6]/button/div/div'
)
.hover();
await page.mouse.down();
@@ -68,7 +68,7 @@ test("TextInputOutputComponent", async ({ page }) => {
// Move to the second element
await page
.locator(
- '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div[2]/div/div[2]/div[9]/div/button/div/div',
+ '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div[2]/div/div[2]/div[9]/div/button/div/div'
)
.hover();
@@ -92,7 +92,7 @@ test("TextInputOutputComponent", async ({ page }) => {
// Click and hold on the first element
await page
.locator(
- '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div[2]/div/div[2]/div[13]/button/div/div',
+ '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div[2]/div/div[2]/div[13]/button/div/div'
)
.hover();
await page.mouse.down();
@@ -100,7 +100,7 @@ test("TextInputOutputComponent", async ({ page }) => {
// Move to the second element
await page
.locator(
- '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div[3]/div/div[2]/div[3]/div/button/div/div',
+ '//*[@id="react-flow-id"]/div/div[1]/div[1]/div/div[2]/div[3]/div/div[2]/div[3]/div/button/div/div'
)
.hover();
@@ -132,11 +132,13 @@ test("TextInputOutputComponent", async ({ page }) => {
await page.getByText("Outputs", { exact: true }).nth(1).click();
await page.getByText("Text Output", { exact: true }).nth(2).click();
- let contentOutput = await page.getByPlaceholder("Empty").inputValue();
+ let contentOutput = await page.getByPlaceholder("Enter text...").inputValue();
expect(contentOutput).not.toBe(null);
await page.keyboard.press("Escape");
+ await page.getByText("Close", { exact: true }).last().click();
+
await page
.getByTestId("popover-anchor-input-input_value")
.nth(0)
@@ -151,6 +153,6 @@ test("TextInputOutputComponent", async ({ page }) => {
await page.getByText("Outputs", { exact: true }).nth(1).click();
await page.getByText("Text Output", { exact: true }).nth(2).click();
- contentOutput = await page.getByPlaceholder("Empty").inputValue();
+ contentOutput = await page.getByPlaceholder("Enter text...").inputValue();
expect(contentOutput).not.toBe(null);
});
diff --git a/src/frontend/tests/end-to-end/toggleComponent.spec.ts b/src/frontend/tests/end-to-end/toggleComponent.spec.ts
index 63c4d473b..cbe77f1e3 100644
--- a/src/frontend/tests/end-to-end/toggleComponent.spec.ts
+++ b/src/frontend/tests/end-to-end/toggleComponent.spec.ts
@@ -45,10 +45,10 @@ test("ToggleComponent", async ({ page }) => {
await page.locator('//*[@id="showload_hidden"]').click();
expect(
- await page.locator('//*[@id="showload_hidden"]').isChecked(),
+ await page.locator('//*[@id="showload_hidden"]').isChecked()
).toBeTruthy();
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
await page.getByTitle("fit view").click();
@@ -81,12 +81,12 @@ test("ToggleComponent", async ({ page }) => {
await page.locator('//*[@id="showload_hidden"]').click();
expect(
- await page.locator('//*[@id="showload_hidden"]').isChecked(),
+ await page.locator('//*[@id="showload_hidden"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showmax_concurrency"]').click();
expect(
- await page.locator('//*[@id="showmax_concurrency"]').isChecked(),
+ await page.locator('//*[@id="showmax_concurrency"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showpath"]').click();
@@ -94,22 +94,22 @@ test("ToggleComponent", async ({ page }) => {
await page.locator('//*[@id="showrecursive"]').click();
expect(
- await page.locator('//*[@id="showrecursive"]').isChecked(),
+ await page.locator('//*[@id="showrecursive"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showsilent_errors"]').click();
expect(
- await page.locator('//*[@id="showsilent_errors"]').isChecked(),
+ await page.locator('//*[@id="showsilent_errors"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showuse_multithreading"]').click();
expect(
- await page.locator('//*[@id="showuse_multithreading"]').isChecked(),
+ await page.locator('//*[@id="showuse_multithreading"]').isChecked()
).toBeTruthy();
await page.locator('//*[@id="showmax_concurrency"]').click();
expect(
- await page.locator('//*[@id="showmax_concurrency"]').isChecked(),
+ await page.locator('//*[@id="showmax_concurrency"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showpath"]').click();
@@ -117,20 +117,20 @@ test("ToggleComponent", async ({ page }) => {
await page.locator('//*[@id="showrecursive"]').click();
expect(
- await page.locator('//*[@id="showrecursive"]').isChecked(),
+ await page.locator('//*[@id="showrecursive"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showsilent_errors"]').click();
expect(
- await page.locator('//*[@id="showsilent_errors"]').isChecked(),
+ await page.locator('//*[@id="showsilent_errors"]').isChecked()
).toBeFalsy();
await page.locator('//*[@id="showuse_multithreading"]').click();
expect(
- await page.locator('//*[@id="showuse_multithreading"]').isChecked(),
+ await page.locator('//*[@id="showuse_multithreading"]').isChecked()
).toBeFalsy();
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
const plusButtonLocator = page.getByTestId("toggle-load_hidden");
const elementCount = await plusButtonLocator?.count();
@@ -144,38 +144,38 @@ test("ToggleComponent", async ({ page }) => {
await page.locator('//*[@id="showload_hidden"]').click();
expect(
- await page.locator('//*[@id="showload_hidden"]').isChecked(),
+ await page.locator('//*[@id="showload_hidden"]').isChecked()
).toBeTruthy();
expect(
- await page.getByTestId("toggle-edit-load_hidden").isChecked(),
+ await page.getByTestId("toggle-edit-load_hidden").isChecked()
).toBeTruthy();
- await page.locator('//*[@id="saveChangesBtn"]').click();
+ await page.getByText("Save Changes", { exact: true }).click();
await page.getByTestId("toggle-load_hidden").click();
expect(
- await page.getByTestId("toggle-load_hidden").isChecked(),
+ await page.getByTestId("toggle-load_hidden").isChecked()
).toBeFalsy();
await page.getByTestId("toggle-load_hidden").click();
expect(
- await page.getByTestId("toggle-load_hidden").isChecked(),
+ await page.getByTestId("toggle-load_hidden").isChecked()
).toBeTruthy();
await page.getByTestId("toggle-load_hidden").click();
expect(
- await page.getByTestId("toggle-load_hidden").isChecked(),
+ await page.getByTestId("toggle-load_hidden").isChecked()
).toBeFalsy();
await page.getByTestId("toggle-load_hidden").click();
expect(
- await page.getByTestId("toggle-load_hidden").isChecked(),
+ await page.getByTestId("toggle-load_hidden").isChecked()
).toBeTruthy();
await page.getByTestId("toggle-load_hidden").click();
expect(
- await page.getByTestId("toggle-load_hidden").isChecked(),
+ await page.getByTestId("toggle-load_hidden").isChecked()
).toBeFalsy();
}
});