feature: get messages from messages table for the playground (#3874)

* refactor: Update MessageBase text attribute based on isinstance check.

* feat: Add update_message function to update a message in the database.

* refactor(chat): Update imports and remove unnecessary config method in ChatComponent.

* refactor: Add stream_message method to ChatComponent.

* refactor: Update method call in ChatOutput component.

* feat: Add callback function to custom component and update build_results signature.

* feat: Add callback parameter to instantiate_class function.

* feat(graph): Add callback functions for sync and async operations.

* feat: Add callback function support to vertex build process.

* feat: Add handling for added message in InterfaceVertex class.

* feat: Add callback support to Graph methods.

* feat(chat): Add callback function to build_vertices function.

* refactor: Simplify update_message function and use session_scope for session management.

* fix: Call set_callback method if available on custom component.

* refactor(chat): Update chat message chunk handling and ID conversion.

* feat: Add null check before setting cache in build_vertex_stream function.

* refactor: Fix send_event_wrapper function and add callback parameter to _build_vertex function.

* refactor: Simplify conditional statement and import order in ChatOutput.

* [autofix.ci] apply automated fixes

* refactor: move log method to Component class.

* refactor: Simplify CallbackFunction definition.

* feat: Initialize _current_output attribute in Component class.

* feat: store current output name in custom component during processing.

* feat: Add current output and component ID to log data.

* fix: Add condition to check current output before invoking callback.

* refactor: Update callback to log_callback in graph methods.

* feat: Add test for callback graph execution with log messages.

* update projects

* fix(chat.py): fix condition to check if message text is a string before updating message text in the database

* refactor(ChatOutput.py): update ChatOutput class to correctly store and assign the message value to ensure consistency and avoid potential bugs

* refactor(chat.py): update return type of store_message method to return a single Message object instead of a list of Messages
refactor(chat.py): update logic to correctly handle updating and returning a single stored message object instead of a list of messages

* update starter projects

* refactor(component.py): update type hint for name parameter in log method to be more explicit

* feat: Add EventManager class for managing events and event registration

* refactor: Update log_callback to event_manager in custom component classes

* refactor(component.py): rename _log_callback to _event_manager and update method call to on_log for better clarity and consistency

* refactor(chat.py): rename _log_callback method to _event_manager.on_token for clarity and consistency in method naming

* refactor: Rename log_callback to event_manager for clarity and consistency

* refactor: Update Vertex class to use EventManager instead of log_callback for better clarity and consistency

* refactor: update build_flow to use EventManager

* refactor: Update EventManager class to use Protocol for event callbacks

* if event_type is not passed, it uses the default send_event

* Add method to register event functions in EventManager

- Introduced `register_event_function` method to allow passing custom event functions.
- Updated `noop` method to accept `event_type` parameter.
- Adjusted `__getattr__` to return `EventCallback` type.

* update test_callback_graph

* Add unit tests for EventManager in test_event_manager.py

- Added tests for event registration, including default event type, empty string names, and specific event types.
- Added tests for custom event functions and unregistered event access.
- Added tests for event sending, including JSON formatting, empty data, and large payloads.
- Added tests for handling JSON serialization errors and the noop function.

* feat: Add callback function support to vertex build process.

* feat: Add callback support to Graph methods.

* feat(chat): Add callback function to build_vertices function.

* [autofix.ci] apply automated fixes

* refactor: Update callback to log_callback in graph methods.

* fetch data from messages and builds at the same time; duplicates still need to be removed

* refactor: Sort chat history by timestamp in ChatView component

* fix: update serialization and improve error handling (#3516)

* feat(utils): add support for V1BaseModel in serialize_field

Add support for V1BaseModel instances in the serialize_field function by
checking for a "to_json" method. If the method is not present, return the
attribute values as a dictionary.
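As described, the check is duck-typed rather than an isinstance test. A minimal sketch of that fallback logic (illustrative only; the real `serialize_field` handles more cases):

```python
from typing import Any


def serialize_field(value: Any) -> Any:
    """Serialize a field value, duck-typing Pydantic-v1-style models."""
    to_json = getattr(value, "to_json", None)
    if callable(to_json):
        # V1BaseModel instances that know how to render themselves.
        return to_json()
    if hasattr(value, "__fields__"):
        # No to_json: fall back to a dict of the declared attribute values.
        return {name: getattr(value, name) for name in value.__fields__}
    return value
```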

* refactor: Update field serializer function and error handling in build_flow function

* remove useMemo to prevent bugs

* feat: add updateMessagePartial method to MessagesStoreType

* feat: update message partially in MessagesStoreType

This commit adds the `updateMessagePartial` method to the `MessagesStoreType` in `messagesStore.ts`. This method allows updating a specific message by merging the changes with the existing message object.

* feat: add log callback for start message in ChatComponent

* feat: update log_callback name

* feat: add log_callback for message in ChatComponent that are not streaming

* refactor: remove console.log statement in buildFlowVertices function

* refactor: store message in ChatInput after updating flow_id

This commit refactors the `ChatInput` component by moving the logic to store the message after updating the `flow_id` property. This ensures that the message is properly stored in the correct flow. The previous implementation had the logic to store the message before updating the `flow_id`, which could lead to incorrect storage of messages. This change improves the reliability and accuracy of message storage in the `ChatInput` component.

* refactor: move message storage logic in ChatInput after updating flow_id

* refactor: update ChatComponent to use stored_message.id instead of self.graph.flow_id

Update the `ChatComponent` class in `chat.py` to use the `stored_message.id` property instead of `self.graph.flow_id` when logging a message. This ensures that the correct message ID is used for logging purposes. The previous implementation used the flow ID, which could lead to incorrect logging. This change improves the accuracy of message logging in the `ChatComponent`.

* refactor: remove unused code and console.log statements

* raw: temp serializer fix

* streaming working but the message comes in one shot

* refactor: optimize message update in useMessagesStore

Improve the efficiency of updating messages in the `useMessagesStore` function of `messagesStore.ts`. Instead of iterating through the entire message list, this refactor searches for the message to update by iterating backwards from the end. This approach allows for faster message retrieval and update. The code has been modified to use a for loop and break out of the loop once the message is found. This change enhances the performance of the message update process.
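The store itself is TypeScript (zustand), but the backward scan it describes is language-neutral; a Python sketch, with the message shape assumed:

```python
def update_message_partial(messages: list[dict], changes: dict) -> list[dict]:
    """Merge ``changes`` into the matching message, scanning from the end.

    New messages are appended to the list, so a streaming update usually
    targets one of the last entries; iterating backwards finds it quickly.
    """
    for i in range(len(messages) - 1, -1, -1):
        if messages[i].get("id") == changes.get("id"):
            messages[i] = {**messages[i], **changes}
            break  # IDs are unique; stop at the first match from the end
    return messages
```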

* Refactor `serialize_flow_id` method to correctly handle UUID serialization in `message.py`

* Refactor `send_event` method to use `jsonable_encoder` for data serialization

* refactor: optimize message update in useMessagesStore

* streaming working with timeout

* refactor: update buildUtils.ts to use data instead of data.data in addMessage function

* version with reactState for chatHistory

* refactor: update on_message method in ChatComponent

* refactor: update on_message method in ChatComponent

* refactor: Remove unused dependency in package-lock.json

* Refactor chatView component and add hiddenSession prop

* Refactor chatView component and update hiddenSessions prop

* Refactor chatView component to use visibleSessions prop instead of hiddenSessions

* Refactor IOModal component to remove redundant code

* Refactor chatView component to include focusChat prop

* Refactor chatView component to include focusChat prop and trigger focus on chat when new session is set

* Refactor IOModal component to update visible sessions when new session is added

* feat: Add session parameter to buildFlowVertices function

* feat: Add someFlowTemplateFields function

Add the someFlowTemplateFields function to the reactflowUtils module. This function checks if any of the nodes in the provided array have template fields that pass a given validation function.
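The actual helper is TypeScript in `reactflowUtils`; a Python sketch of the same check, with the reactflow node shape assumed:

```python
from typing import Any, Callable


def some_flow_template_fields(
    nodes: list[dict[str, Any]],
    validate: Callable[[dict[str, Any]], bool],
) -> bool:
    """Return True if any node has a template field that passes ``validate``."""
    return any(
        validate(field)
        for node in nodes
        for field in node.get("data", {}).get("node", {}).get("template", {}).values()
        if isinstance(field, dict)
    )
```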

* feat: Add session parameter to buildFlowVertices function

* feat: Add session parameter to buildFlowVertices function

* update Session logic on ioModal

* Refactor ChatView component: Remove unused eraser button

The eraser button in the ChatView component was removed as it was not being used and served no purpose. This change improves code cleanliness and removes unnecessary code.

* Refactor Vertex class: Inject session_id if provided in inputs

* Refactor build_flow function: Set default session if inputs are empty

* Refactor InputValueRequest schema: Add session parameter

* Refactor IOModal component: Update session logic

* Refactor buildFlowVertices function: Update input handling

* Refactor MessagesStoreType in zustand/messages/index.ts: Remove unused columns property and setColumns method

* Refactor MessagesStoreType: Remove unused columns property and setColumns method

* Refactor SessionView component: Update columns extraction logic

* Refactor ChatView component: Remove unused variables

* Refactor useGetMessagesQuery: Remove unused setColumns method

* Refactor RenderIcons component: Set default value for filteredShortcut prop to prevent bug

* create edit message component for chat view

* Refactor useUpdateMessage: Add refetch option to trigger query refetch

* Refactor IOModal component: Remove unused variables and update useGetMessagesQuery

* Refactor ChatView component: Add session ID to message object

* update chat message to handle message edit

* update types

* fix: Update API call to send entire message object

* Refactor EditMessageField component: Add timeout to onBlur event

* Refactor EditMessageField component: Update layout of edit message field

* create migration

* add fields to data table

* feat: Add "edit" flag to message_dict in update_message API endpoint

* Refactor EditMessageField component: Improve onBlur event handling and add button click flag

* Refactor code to include "edit" flag in message types

* feat: Add EditMessageButton component for editing chat messages

* Refactor ChatMessage component: Add EditMessageButton and improve layout

* fix: Add refetch query for current flow messages not all flows

* Refactor ChatMessage component: Add ShadTooltip for EditMessageButton

* add info into edit message field

* fix: migrate

* fix running chat input directly from the flow

* [autofix.ci] apply automated fixes

* fix edit flag

* Refactor IOModal component to generate a unique session ID based on the current date and time
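The exact ID format is not shown in this diff; a hedged Python sketch of one plausible date-and-time-based scheme:

```python
from datetime import datetime, timezone


def new_session_id() -> str:
    """Build a human-readable, time-based session ID (assumed format)."""
    return datetime.now(timezone.utc).strftime("Session %b %d, %H:%M:%S")
```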

* [autofix.ci] apply automated fixes

* Refactor IOModal component to improve session management and interaction

* [autofix.ci] apply automated fixes

* Refactor sessionSelector component to improve session management and interaction

* chore: Refactor sessionSelector component to improve session management and interaction

* [autofix.ci] apply automated fixes

* create mutation to handle session rename

* refactor: Rename useUpdateSession to useUpdateSessionName for clarity

* [autofix.ci] apply automated fixes

* Refactor sessionSelector component for improved session management and interaction

* Refactor sessionSelector component to update visible session on session name change

* [autofix.ci] apply automated fixes

* add message related events back

* chore: Add console logs for debugging in buildFlowVertices function

* Refactor IOModal component to update tab trigger label from "Memories" to "Chat"

* improve edit name feature

* Refactor IOModal component button label to "New Chat"

* Refactor sessionSelector component to improve session management and interaction

* Refactor IOModal component to remove unused code and improve session management

* fix typing error

* fix run chat input on component level

* prevent toggling visibility on session menu

* fix bug on rename session while in table view mode

* chore: Update setSelectedView prop type in sessionSelector component

* add first test version (not working yet)

* fix bug for renaming and deleting session

* refactor: Update sessionSelector component to handle session changes

* improve test

* fix multiple bugs when renaming sessions

* change visible session from array to string

* chore: Update editMessageField component to include margin-right for text span

* [autofix.ci] apply automated fixes

* Update down_revision in Alembic migration script

* Refactor IOModal component to simplify session visibility handling

* Fix comparison operator for filtering error messages in memory.py

* Refactor ChatInput to conditionally store and update messages

* Refactor JSON formatting for improved readability in starter projects

* Add type casting for message_text and import cast from typing module

* Refactor input handling to use direct dictionary access for 'session' and 'input_value' keys

* Allow `update_message` to accept `str` type for `message_id` parameter

* ⬆️ (pyproject.toml): upgrade duckduckgo-search dependency to version 6.3.1 for bug fixes or new features
🔧 (duckduckgo.spec.ts): refactor test to handle multiple possible outcomes when waiting for selectors and improve readability

* Refactor test file: generalBugs-shard-0.spec.ts

* Refactor test file: freeze.spec.ts

* Refactor test files: update element selectors and actions

* Refactor test file: chatInputOutput.spec.ts

* [autofix.ci] apply automated fixes

* Refactor chatMessage component to handle different types of children content on code modal

* [autofix.ci] apply automated fixes

---------

Co-authored-by: Gabriel Luiz Freitas Almeida <gabriel@langflow.org>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: italojohnny <italojohnnydosanjos@gmail.com>
Co-authored-by: cristhianzl <cristhian.lousa@gmail.com>
Commit 908c141d97 by anovazzi1, 2024-10-15 16:09:12 -03:00, committed via GitHub (GPG key ID: B5690EEEBB952194).
58 changed files with 3782 additions and 1311 deletions.

@@ -105,7 +105,7 @@ dependencies = [
"jq>=1.8.0",
"pydantic-settings==2.4.0",
"ragstack-ai-knowledge-store>=0.2.1",
"duckduckgo-search>=6.3.0",
"duckduckgo-search>=6.3.1",
"langchain-elasticsearch>=0.2.0",
"opensearch-py>=2.7.1",
]


@@ -0,0 +1,49 @@
"""Add error and edit flags to message
Revision ID: eb5e72293a8e
Revises: 5ace73a7f223
Create Date: 2024-09-19 16:18:50.828648
"""
from typing import Sequence, Union
import sqlalchemy as sa
from alembic import op
from sqlalchemy.engine.reflection import Inspector
# revision identifiers, used by Alembic.
revision: str = "eb5e72293a8e"
down_revision: Union[str, None] = "5ace73a7f223"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
table_names = inspector.get_table_names() # noqa
column_names = [column["name"] for column in inspector.get_columns("message")]
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("message", schema=None) as batch_op:
if "error" not in column_names:
batch_op.add_column(sa.Column("error", sa.Boolean(), nullable=False, server_default=sa.false()))
if "edit" not in column_names:
batch_op.add_column(sa.Column("edit", sa.Boolean(), nullable=False, server_default=sa.false()))
# ### end Alembic commands ###
def downgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
table_names = inspector.get_table_names() # noqa
column_names = [column["name"] for column in inspector.get_columns("message")]
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("message", schema=None) as batch_op:
if "edit" in column_names:
batch_op.drop_column("edit")
if "error" in column_names:
batch_op.drop_column("error")
# ### end Alembic commands ###


@@ -155,6 +155,9 @@ async def build_flow(
    telemetry_service: TelemetryService = Depends(get_telemetry_service),
    session=Depends(get_session),
):
    if not inputs:
        inputs = InputValueRequest(session=str(flow_id))

    async def build_graph_and_get_order() -> tuple[list[str], list[str], Graph]:
        start_time = time.perf_counter()
        components_count = None


@@ -99,6 +99,7 @@ async def update_message(
    try:
        message_dict = message.model_dump(exclude_unset=True, exclude_none=True)
        message_dict["edit"] = True
        db_message.sqlmodel_update(message_dict)
        session.add(db_message)
        session.commit()


@@ -297,6 +297,7 @@ class VerticesBuiltResponse(BaseModel):
class InputValueRequest(BaseModel):
    components: list[str] | None = []
    input_value: str | None = None
    session: str | None = None
    type: InputType | None = Field(
        "any",
        description="Defines on which components the input value should be applied. "
@@ -310,9 +311,12 @@ class InputValueRequest(BaseModel):
{
"components": ["components_id", "Component Name"],
"input_value": "input_value",
"session": "session_id",
},
{"components": ["Component Name"], "input_value": "input_value"},
{"input_value": "input_value"},
{"components": ["Component Name"], "input_value": "input_value", "session": "session_id"},
{"input_value": "input_value", "session": "session_id"},
{"type": "chat", "input_value": "input_value"},
{"type": "json", "input_value": '{"key": "value"}'},
]


@@ -28,7 +28,7 @@ response = requests.get(url)
data = json.loads(response.text)
# Extract the model names into a Python list
litellm_model_names = [model for model, _ in data.items() if model != "sample_spec"]
litellm_model_names = [model for model in data if model != "sample_spec"]
# To store the class names that extend ToolInterface


@@ -1,4 +1,5 @@
from collections.abc import AsyncIterator, Iterator
from typing import cast
from langflow.custom import Component
from langflow.memory import store_message
@@ -12,43 +13,52 @@ class ChatComponent(Component):
display_name = "Chat Component"
description = "Use as base for chat components."
# Keep this method for backward compatibility
def store_message(
self,
message: Message,
) -> Message:
messages = store_message(
message,
flow_id=self.graph.flow_id,
)
if len(messages) > 1:
def store_message(self, message: Message) -> Message:
messages = store_message(message, flow_id=self.graph.flow_id)
if len(messages) != 1:
msg = "Only one message can be stored at a time."
raise ValueError(msg)
stored_message = messages[0]
if (
hasattr(self, "_event_manager")
and self._event_manager
and stored_message.id
and not isinstance(message.text, str)
):
self._send_message_event(stored_message)
if self._should_stream_message(stored_message, message):
complete_message = self._stream_message(message, stored_message.id)
message_table = update_message(message_id=stored_message.id, message={"text": complete_message})
stored_message = Message(**message_table.model_dump())
self.vertex._added_message = stored_message
stored_message = self._update_stored_message(stored_message.id, complete_message)
self.status = stored_message
return stored_message
def _send_message_event(self, message: Message):
if hasattr(self, "_event_manager") and self._event_manager:
self._event_manager.on_message(data=message.data)
def _should_stream_message(self, stored_message: Message, original_message: Message) -> bool:
return bool(
hasattr(self, "_event_manager")
and self._event_manager
and stored_message.id
and not isinstance(original_message.text, str)
)
def _update_stored_message(self, message_id: str, complete_message: str) -> Message:
message_table = update_message(message_id=message_id, message={"text": complete_message})
updated_message = Message(**message_table.model_dump())
self.vertex._added_message = updated_message
return updated_message
def _process_chunk(self, chunk: str, complete_message: str, message: Message, message_id: str) -> str:
complete_message += chunk
data = {
"text": complete_message,
"chunk": chunk,
"sender": message.sender,
"sender_name": message.sender_name,
"id": str(message_id),
}
if self._event_manager:
self._event_manager.on_token(data=data)
self._event_manager.on_token(
data={
"text": complete_message,
"chunk": chunk,
"sender": message.sender,
"sender_name": message.sender_name,
"id": str(message_id),
}
)
return complete_message
async def _handle_async_iterator(self, iterator: AsyncIterator, message: Message, message_id: str) -> str:
@@ -69,7 +79,6 @@ class ChatComponent(Component):
complete_message = ""
for chunk in iterator:
complete_message = self._process_chunk(chunk.content, complete_message, message, message_id)
return complete_message
def build_with_data(
@@ -80,22 +89,25 @@ class ChatComponent(Component):
input_value: str | Data | Message | None = None,
files: list[str] | None = None,
session_id: str | None = None,
return_message: bool | None = False,
) -> Message:
if isinstance(input_value, Data):
# Update the data of the record
message = Message.from_data(input_value)
else:
message = Message(
text=input_value, sender=sender, sender_name=sender_name, files=files, session_id=session_id
)
return_message: bool = False,
) -> str | Message:
message = self._create_message(input_value, sender, sender_name, files, session_id)
message_text = message.text if not return_message else message
self.status = message_text
if session_id and isinstance(message, Message) and isinstance(message.text, str):
messages = store_message(
message,
flow_id=self.graph.flow_id,
)
messages = store_message(message, flow_id=self.graph.flow_id)
self.status = messages
return message_text # type: ignore[return-value]
self._send_messages_events(messages)
return cast(str | Message, message_text)
def _create_message(self, input_value, sender, sender_name, files, session_id) -> Message:
if isinstance(input_value, Data):
return Message.from_data(input_value)
return Message(text=input_value, sender=sender, sender_name=sender_name, files=files, session_id=session_id)
def _send_messages_events(self, messages):
if hasattr(self, "_event_manager") and self._event_manager:
for stored_message in messages:
self._event_manager.on_message(data=stored_message.data)


@@ -2,7 +2,6 @@ from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES
from langflow.base.io.chat import ChatComponent
from langflow.inputs import BoolInput
from langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output
from langflow.memory import store_message
from langflow.schema.message import Message
from langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER
@@ -69,18 +68,12 @@ class ChatInput(ChatComponent):
session_id=self.session_id,
files=self.files,
)
if (
self.session_id
and isinstance(message, Message)
and isinstance(message.text, str)
and self.should_store_message
):
store_message(
if self.session_id and isinstance(message, Message) and self.should_store_message:
stored_message = self.store_message(
message,
flow_id=self.graph.flow_id,
)
self.message.value = message
self.message.value = stored_message
message = stored_message
self.status = message
return message


@@ -1,7 +1,6 @@
from langflow.base.io.chat import ChatComponent
from langflow.inputs import BoolInput
from langflow.io import DropdownInput, MessageTextInput, Output
from langflow.memory import store_message
from langflow.schema.message import Message
from langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER
@@ -65,17 +64,12 @@ class ChatOutput(ChatComponent):
sender_name=self.sender_name,
session_id=self.session_id,
)
if (
self.session_id
and isinstance(message, Message)
and isinstance(message.text, str)
and self.should_store_message
):
store_message(
if self.session_id and isinstance(message, Message) and self.should_store_message:
stored_message = self.store_message(
message,
flow_id=self.graph.flow_id,
)
self.message.value = message
self.message.value = stored_message
message = stored_message
self.status = message
return message


@@ -5,6 +5,7 @@ import time
import uuid
from functools import partial
from fastapi.encoders import jsonable_encoder
from typing_extensions import Protocol
from langflow.schema.log import LoggableType
@@ -52,7 +53,8 @@ class EventManager:
self.events[name] = _callback
def send_event(self, *, event_type: str, data: LoggableType):
json_data = {"event": event_type, "data": data}
jsonable_data = jsonable_encoder(data)
json_data = {"event": event_type, "data": jsonable_data}
event_id = uuid.uuid4()
str_data = json.dumps(json_data) + "\n\n"
self.queue.put_nowait((event_id, str_data.encode("utf-8"), time.time()))


@@ -793,11 +793,20 @@ class Vertex:
# and we are just getting the result for the requester
return await self.get_requester_result(requester)
self._reset()
# inject session_id if it is not None
if inputs is not None and "session" in inputs and inputs["session"] is not None and self.has_session_id:
session_id_value = self.get_value_from_template_dict("session_id")
if session_id_value == "":
self.update_raw_params({"session_id": inputs["session"]}, overwrite=True)
if self._is_chat_input() and (inputs or files):
chat_input = {}
if inputs:
chat_input.update({"input_value": inputs.get(INPUT_FIELD_NAME, "")})
if (
inputs
and isinstance(inputs, dict)
and "input_value" in inputs
and inputs["input_value"] is not None
):
chat_input.update({"input_value": inputs[INPUT_FIELD_NAME]})
if files:
chat_input.update({"files": files})


@@ -8,12 +8,16 @@
"dataType": "ToolCallingAgent",
"id": "ToolCallingAgent-mf0BN",
"name": "response",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ChatOutput-Ag9YG",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@@ -30,12 +34,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-1ioeW",
"name": "model_output",
"output_types": ["LanguageModel"]
"output_types": [
"LanguageModel"
]
},
"targetHandle": {
"fieldName": "llm",
"id": "ToolCallingAgent-mf0BN",
"inputTypes": ["LanguageModel"],
"inputTypes": [
"LanguageModel"
],
"type": "other"
}
},
@@ -52,12 +60,17 @@
"dataType": "CalculatorTool",
"id": "CalculatorTool-Nb4P5",
"name": "api_build_tool",
"output_types": ["Tool"]
"output_types": [
"Tool"
]
},
"targetHandle": {
"fieldName": "tools",
"id": "ToolCallingAgent-mf0BN",
"inputTypes": ["Tool", "BaseTool"],
"inputTypes": [
"Tool",
"BaseTool"
],
"type": "other"
}
},
@@ -74,12 +87,16 @@
"dataType": "ChatInput",
"id": "ChatInput-X3ARP",
"name": "message",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ToolCallingAgent-mf0BN",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@@ -95,12 +112,17 @@
"dataType": "PythonREPLTool",
"id": "PythonREPLTool-i922a",
"name": "api_build_tool",
"output_types": ["Tool"]
"output_types": [
"Tool"
]
},
"targetHandle": {
"fieldName": "tools",
"id": "ToolCallingAgent-mf0BN",
"inputTypes": ["Tool", "BaseTool"],
"inputTypes": [
"Tool",
"BaseTool"
],
"type": "other"
}
},
@@ -116,7 +138,9 @@
"data": {
"id": "ChatInput-X3ARP",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@@ -144,7 +168,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@@ -167,7 +193,7 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"files": {
"_input_type": "FileInput",
@@ -217,7 +243,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as input.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@@ -239,7 +267,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@@ -254,7 +285,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@@ -273,7 +306,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@@ -325,7 +360,9 @@
"data": {
"id": "ChatOutput-Ag9YG",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@@ -353,7 +390,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@@ -376,7 +415,7 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"data_template": {
"_input_type": "MessageTextInput",
@@ -384,7 +423,9 @@
"display_name": "Data Template",
"dynamic": false,
"info": "Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "data_template",
@@ -403,7 +444,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as output.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@@ -424,7 +467,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@@ -439,7 +485,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@@ -458,7 +506,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@@ -512,7 +562,10 @@
"display_name": "OpenAI",
"id": "OpenAIModel-1ioeW",
"node": {
"base_classes": ["LanguageModel", "Message"],
"base_classes": [
"LanguageModel",
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@@ -545,9 +598,15 @@
"display_name": "Text",
"method": "text_response",
"name": "text_output",
"required_inputs": ["input_value", "stream", "system_message"],
"required_inputs": [
"input_value",
"stream",
"system_message"
],
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
},
{
@@ -567,7 +626,9 @@
"temperature"
],
"selected": "LanguageModel",
"types": ["LanguageModel"],
"types": [
"LanguageModel"
],
"value": "__UNDEFINED__"
}
],
@@ -580,7 +641,9 @@
"display_name": "OpenAI API Key",
"dynamic": false,
"info": "The OpenAI API Key to use for the OpenAI model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"load_from_db": true,
"name": "api_key",
"password": true,
@@ -615,7 +678,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@@ -731,7 +796,9 @@
"display_name": "Output Parser",
"dynamic": false,
"info": "The parser to use to parse the output of the model",
"input_types": ["OutputParser"],
"input_types": [
"OutputParser"
],
"list": false,
"name": "output_parser",
"placeholder": "",
@@ -796,7 +863,9 @@
"display_name": "System Message",
"dynamic": false,
"info": "System message to pass to the model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "system_message",
@@ -848,7 +917,10 @@
"data": {
"id": "ToolCallingAgent-mf0BN",
"node": {
"base_classes": ["AgentExecutor", "Message"],
"base_classes": [
"AgentExecutor",
"Message"
],
"beta": true,
"conditional_paths": [],
"custom_fields": {},
@@ -879,7 +951,9 @@
"method": "build_agent",
"name": "agent",
"selected": "AgentExecutor",
"types": ["AgentExecutor"],
"types": [
"AgentExecutor"
],
"value": "__UNDEFINED__"
},
{
@@ -888,7 +962,9 @@
"method": "message_response",
"name": "response",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@@ -901,7 +977,9 @@
"display_name": "Chat History",
"dynamic": false,
"info": "",
"input_types": ["Data"],
"input_types": [
"Data"
],
"list": true,
"name": "chat_history",
"placeholder": "",
@@ -953,7 +1031,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@@ -972,7 +1052,9 @@
"display_name": "Language Model",
"dynamic": false,
"info": "",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "llm",
"placeholder": "",
@@ -1005,7 +1087,9 @@
"display_name": "System Prompt",
"dynamic": false,
"info": "System prompt for the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@@ -1025,7 +1109,10 @@
"display_name": "Tools",
"dynamic": false,
"info": "",
"input_types": ["Tool", "BaseTool"],
"input_types": [
"Tool",
"BaseTool"
],
"list": true,
"load_from_db": false,
"name": "tools",
@@ -1043,7 +1130,9 @@
"display_name": "Prompt",
"dynamic": false,
"info": "This prompt must contain 'input' key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@@ -1096,7 +1185,12 @@
"data": {
"id": "CalculatorTool-Nb4P5",
"node": {
"base_classes": ["Data", "list", "Sequence", "Tool"],
"base_classes": [
"Data",
"list",
"Sequence",
"Tool"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@@ -1104,7 +1198,9 @@
"display_name": "Calculator",
"documentation": "",
"edited": false,
"field_order": ["expression"],
"field_order": [
"expression"
],
"frozen": false,
"icon": "calculator",
"lf_version": "1.0.16",
@@ -1117,9 +1213,13 @@
"display_name": "Data",
"method": "run_model",
"name": "api_run_model",
"required_inputs": ["expression"],
"required_inputs": [
"expression"
],
"selected": "Data",
"types": ["Data"],
"types": [
"Data"
],
"value": "__UNDEFINED__"
},
{
@@ -1127,9 +1227,13 @@
"display_name": "Tool",
"method": "build_tool",
"name": "api_build_tool",
"required_inputs": ["expression"],
"required_inputs": [
"expression"
],
"selected": "Tool",
"types": ["Tool"],
"types": [
"Tool"
],
"value": "__UNDEFINED__"
}
],
@@ -1160,7 +1264,9 @@
"display_name": "Expression",
"dynamic": false,
"info": "The arithmetic expression to evaluate (e.g., '4*4*(33/22)+12-20').",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "expression",
@@ -1198,7 +1304,10 @@
"display_name": "Python REPL Tool",
"id": "PythonREPLTool-i922a",
"node": {
"base_classes": ["Data", "Tool"],
"base_classes": [
"Data",
"Tool"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@@ -1206,7 +1315,12 @@
"display_name": "Python REPL Tool",
"documentation": "",
"edited": false,
"field_order": ["name", "description", "global_imports", "code"],
"field_order": [
"name",
"description",
"global_imports",
"code"
],
"frozen": false,
"metadata": {},
"output_types": [],
@@ -1223,7 +1337,9 @@
"name"
],
"selected": "Data",
"types": ["Data"],
"types": [
"Data"
],
"value": "__UNDEFINED__"
},
{
@@ -1238,7 +1354,9 @@
"name"
],
"selected": "Tool",
"types": ["Tool"],
"types": [
"Tool"
],
"value": "__UNDEFINED__"
}
],


@@ -8,12 +8,17 @@
"dataType": "ChatInput",
"id": "ChatInput-AwB1F",
"name": "message",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "user_input",
"id": "Prompt-bHLxK",
"inputTypes": ["Message", "Text"],
"inputTypes": [
"Message",
"Text"
],
"type": "str"
}
},
@@ -30,12 +35,16 @@
"dataType": "Prompt",
"id": "Prompt-bHLxK",
"name": "prompt",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "OpenAIModel-tnzXU",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@@ -52,12 +61,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-tnzXU",
"name": "text_output",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ChatOutput-wbcyd",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@@ -75,7 +88,9 @@
"display_name": "Chat Input",
"id": "ChatInput-AwB1F",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@@ -102,7 +117,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@@ -125,7 +142,7 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"files": {
"advanced": true,
@@ -173,7 +190,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as input.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@@ -193,7 +212,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@@ -207,7 +229,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@@ -225,7 +249,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@@ -279,17 +305,23 @@
"display_name": "Prompt",
"id": "Prompt-bHLxK",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {
"template": ["user_input"]
"template": [
"user_input"
]
},
"description": "Create a prompt template with dynamic variables.",
"display_name": "Prompt",
"documentation": "",
"edited": false,
"field_order": ["template"],
"field_order": [
"template"
],
"frozen": false,
"icon": "prompts",
"metadata": {},
@@ -301,7 +333,9 @@
"method": "build_prompt",
"name": "prompt",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@@ -350,7 +384,10 @@
"fileTypes": [],
"file_path": "",
"info": "",
"input_types": ["Message", "Text"],
"input_types": [
"Message",
"Text"
],
"list": false,
"load_from_db": false,
"multiline": true,
@@ -388,7 +425,9 @@
"display_name": "Chat Output",
"id": "ChatOutput-wbcyd",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@@ -415,7 +454,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@@ -438,14 +479,16 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"data_template": {
"advanced": true,
"display_name": "Data Template",
"dynamic": false,
"info": "Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "data_template",
@@ -463,7 +506,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as output.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@@ -482,7 +527,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@@ -496,7 +544,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@@ -514,7 +564,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@@ -568,7 +620,10 @@
"display_name": "OpenAI",
"id": "OpenAIModel-tnzXU",
"node": {
"base_classes": ["LanguageModel", "Message"],
"base_classes": [
"LanguageModel",
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@@ -600,9 +655,15 @@
"display_name": "Text",
"method": "text_response",
"name": "text_output",
"required_inputs": ["input_value", "stream", "system_message"],
"required_inputs": [
"input_value",
"stream",
"system_message"
],
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
},
{
@@ -622,7 +683,9 @@
"temperature"
],
"selected": "LanguageModel",
"types": ["LanguageModel"],
"types": [
"LanguageModel"
],
"value": "__UNDEFINED__"
}
],
@@ -635,7 +698,9 @@
"display_name": "OpenAI API Key",
"dynamic": false,
"info": "The OpenAI API Key to use for the OpenAI model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"load_from_db": true,
"name": "api_key",
"password": true,
@@ -669,7 +734,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@@ -772,7 +839,9 @@
"display_name": "Output Parser",
"dynamic": false,
"info": "The parser to use to parse the output of the model",
"input_types": ["OutputParser"],
"input_types": [
"OutputParser"
],
"list": false,
"name": "output_parser",
"placeholder": "",


@@ -8,12 +8,16 @@
"dataType": "URL",
"id": "URL-46k0m",
"name": "data",
"output_types": ["Data"]
"output_types": [
"Data"
]
},
"targetHandle": {
"fieldName": "data",
"id": "ParseData-jUQRS",
"inputTypes": ["Data"],
"inputTypes": [
"Data"
],
"type": "other"
}
},
@@ -30,12 +34,17 @@
"dataType": "ParseData",
"id": "ParseData-jUQRS",
"name": "text",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "references",
"id": "Prompt-Pf4QQ",
"inputTypes": ["Message", "Text"],
"inputTypes": [
"Message",
"Text"
],
"type": "str"
}
},
@@ -52,12 +61,17 @@
"dataType": "TextInput",
"id": "TextInput-slCbp",
"name": "text",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "instructions",
"id": "Prompt-Pf4QQ",
"inputTypes": ["Message", "Text"],
"inputTypes": [
"Message",
"Text"
],
"type": "str"
}
},
@@ -74,12 +88,16 @@
"dataType": "Prompt",
"id": "Prompt-Pf4QQ",
"name": "prompt",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "OpenAIModel-o0Gr0",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@@ -96,12 +114,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-o0Gr0",
"name": "text_output",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ChatOutput-eIVde",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@@ -119,7 +141,9 @@
"display_name": "URL",
"id": "URL-46k0m",
"node": {
"base_classes": ["Data"],
"base_classes": [
"Data"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@@ -127,7 +151,9 @@
"display_name": "URL",
"documentation": "",
"edited": false,
"field_order": ["urls"],
"field_order": [
"urls"
],
"frozen": false,
"icon": "layout-template",
"metadata": {},
@@ -139,7 +165,9 @@
"method": "fetch_content",
"name": "data",
"selected": "Data",
"types": ["Data"],
"types": [
"Data"
],
"value": "__UNDEFINED__"
},
{
@@ -148,7 +176,9 @@
"method": "fetch_content_text",
"name": "text",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@@ -181,7 +211,10 @@
"dynamic": false,
"info": "Output format. Use 'Text' to extract the text from the HTML or 'Raw HTML' for the raw HTML content.",
"name": "format",
"options": ["Text", "Raw HTML"],
"options": [
"Text",
"Raw HTML"
],
"placeholder": "",
"required": false,
"show": true,
@@ -195,7 +228,9 @@
"display_name": "URLs",
"dynamic": false,
"info": "Enter one or more URLs, by clicking the '+' button.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": true,
"load_from_db": false,
"name": "urls",
@@ -206,7 +241,10 @@
"trace_as_input": true,
"trace_as_metadata": true,
"type": "str",
"value": ["langflow.org/", "docs.langflow.org/"]
"value": [
"langflow.org/",
"docs.langflow.org/"
]
}
}
},
@@ -233,7 +271,9 @@
"display_name": "Parse Data",
"id": "ParseData-jUQRS",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@@ -241,7 +281,11 @@
"display_name": "Parse Data",
"documentation": "",
"edited": false,
"field_order": ["data", "template", "sep"],
"field_order": [
"data",
"template",
"sep"
],
"frozen": false,
"icon": "braces",
"metadata": {},
@@ -253,7 +297,9 @@
"method": "parse_data",
"name": "text",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@@ -283,7 +329,9 @@
"display_name": "Data",
"dynamic": false,
"info": "The data to convert to text.",
"input_types": ["Data"],
"input_types": [
"Data"
],
"list": false,
"name": "data",
"placeholder": "",
@@ -316,7 +364,9 @@
"display_name": "Template",
"dynamic": false,
"info": "The template to use for formatting the data. It can contain the keys {text}, {data} or any other key in the Data.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@@ -355,17 +405,24 @@
"display_name": "Prompt",
"id": "Prompt-Pf4QQ",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {
"template": ["references", "instructions"]
"template": [
"references",
"instructions"
]
},
"description": "Create a prompt template with dynamic variables.",
"display_name": "Prompt",
"documentation": "",
"edited": false,
"field_order": ["template"],
"field_order": [
"template"
],
"frozen": false,
"icon": "prompts",
"metadata": {},
@@ -377,7 +434,9 @@
"method": "build_prompt",
"name": "prompt",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -410,7 +469,10 @@
"fileTypes": [],
"file_path": "",
"info": "",
"input_types": ["Message", "Text"],
"input_types": [
"Message",
"Text"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -431,7 +493,10 @@
"fileTypes": [],
"file_path": "",
"info": "",
"input_types": ["Message", "Text"],
"input_types": [
"Message",
"Text"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -485,7 +550,9 @@
"display_name": "Instructions",
"id": "TextInput-slCbp",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -493,7 +560,9 @@
"display_name": "Instructions",
"documentation": "",
"edited": false,
"field_order": ["input_value"],
"field_order": [
"input_value"
],
"frozen": false,
"icon": "type",
"output_types": [],
@ -504,7 +573,9 @@
"method": "text_response",
"name": "text",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -535,7 +606,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Text to be passed as input.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -574,7 +647,9 @@
"display_name": "Chat Output",
"id": "ChatOutput-eIVde",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -601,7 +676,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -624,14 +701,16 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"data_template": {
"advanced": true,
"display_name": "Data Template",
"dynamic": false,
"info": "Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "data_template",
@ -649,7 +728,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as output.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -668,7 +749,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@ -682,7 +766,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@ -700,7 +786,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@ -754,7 +842,10 @@
"display_name": "OpenAI",
"id": "OpenAIModel-o0Gr0",
"node": {
"base_classes": ["LanguageModel", "Message"],
"base_classes": [
"LanguageModel",
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -786,9 +877,15 @@
"display_name": "Text",
"method": "text_response",
"name": "text_output",
"required_inputs": ["input_value", "stream", "system_message"],
"required_inputs": [
"input_value",
"stream",
"system_message"
],
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
},
{
@ -808,7 +905,9 @@
"temperature"
],
"selected": "LanguageModel",
"types": ["LanguageModel"],
"types": [
"LanguageModel"
],
"value": "__UNDEFINED__"
}
],
@ -821,7 +920,9 @@
"display_name": "OpenAI API Key",
"dynamic": false,
"info": "The OpenAI API Key to use for the OpenAI model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"load_from_db": true,
"name": "api_key",
"password": true,
@ -855,7 +956,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -958,7 +1061,9 @@
"display_name": "Output Parser",
"dynamic": false,
"info": "The parser to use to parse the output of the model",
"input_types": ["OutputParser"],
"input_types": [
"OutputParser"
],
"list": false,
"name": "output_parser",
"placeholder": "",

View file

@ -8,12 +8,17 @@
"dataType": "ChatInput",
"id": "ChatInput-Emi4q",
"name": "message",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "Question",
"id": "Prompt-n8yRL",
"inputTypes": ["Message", "Text"],
"inputTypes": [
"Message",
"Text"
],
"type": "str"
}
},
@ -30,12 +35,16 @@
"dataType": "Prompt",
"id": "Prompt-n8yRL",
"name": "prompt",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "OpenAIModel-1hwZ2",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -52,12 +61,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-1hwZ2",
"name": "text_output",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ChatOutput-sD0lp",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -74,12 +87,17 @@
"dataType": "ParseData",
"id": "ParseData-qYLes",
"name": "text",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "Document",
"id": "Prompt-n8yRL",
"inputTypes": ["Message", "Text"],
"inputTypes": [
"Message",
"Text"
],
"type": "str"
}
},
@ -96,12 +114,16 @@
"dataType": "File",
"id": "File-0oa6O",
"name": "data",
"output_types": ["Data"]
"output_types": [
"Data"
]
},
"targetHandle": {
"fieldName": "data",
"id": "ParseData-qYLes",
"inputTypes": ["Data"],
"inputTypes": [
"Data"
],
"type": "other"
}
},
@ -119,17 +141,24 @@
"display_name": "Prompt",
"id": "Prompt-n8yRL",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {
"template": ["Document", "Question"]
"template": [
"Document",
"Question"
]
},
"description": "Create a prompt template with dynamic variables.",
"display_name": "Prompt",
"documentation": "",
"edited": false,
"field_order": ["template"],
"field_order": [
"template"
],
"frozen": false,
"icon": "prompts",
"metadata": {},
@ -141,7 +170,9 @@
"method": "build_prompt",
"name": "prompt",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -155,7 +186,10 @@
"fileTypes": [],
"file_path": "",
"info": "",
"input_types": ["Message", "Text"],
"input_types": [
"Message",
"Text"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -176,7 +210,10 @@
"fileTypes": [],
"file_path": "",
"info": "",
"input_types": ["Message", "Text"],
"input_types": [
"Message",
"Text"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -249,7 +286,9 @@
"display_name": "Chat Input",
"id": "ChatInput-Emi4q",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -276,7 +315,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -299,7 +340,7 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"files": {
"advanced": true,
@ -347,7 +388,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as input.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -367,7 +410,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@ -381,7 +427,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@ -399,7 +447,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@ -453,7 +503,9 @@
"display_name": "Chat Output",
"id": "ChatOutput-sD0lp",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -480,7 +532,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -503,14 +557,16 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"data_template": {
"advanced": true,
"display_name": "Data Template",
"dynamic": false,
"info": "Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "data_template",
@ -528,7 +584,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as output.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -547,7 +605,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@ -561,7 +622,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@ -579,7 +642,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@ -633,7 +698,10 @@
"display_name": "OpenAI",
"id": "OpenAIModel-1hwZ2",
"node": {
"base_classes": ["LanguageModel", "Message"],
"base_classes": [
"LanguageModel",
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -665,9 +733,15 @@
"display_name": "Text",
"method": "text_response",
"name": "text_output",
"required_inputs": ["input_value", "stream", "system_message"],
"required_inputs": [
"input_value",
"stream",
"system_message"
],
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
},
{
@ -687,7 +761,9 @@
"temperature"
],
"selected": "LanguageModel",
"types": ["LanguageModel"],
"types": [
"LanguageModel"
],
"value": "__UNDEFINED__"
}
],
@ -700,7 +776,9 @@
"display_name": "OpenAI API Key",
"dynamic": false,
"info": "The OpenAI API Key to use for the OpenAI model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"load_from_db": true,
"name": "api_key",
"password": true,
@ -734,7 +812,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -837,7 +917,9 @@
"display_name": "Output Parser",
"dynamic": false,
"info": "The parser to use to parse the output of the model",
"input_types": ["OutputParser"],
"input_types": [
"OutputParser"
],
"list": false,
"name": "output_parser",
"placeholder": "",
@ -949,7 +1031,9 @@
"display_name": "Parse Data",
"id": "ParseData-qYLes",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -957,7 +1041,11 @@
"display_name": "Parse Data",
"documentation": "",
"edited": false,
"field_order": ["data", "template", "sep"],
"field_order": [
"data",
"template",
"sep"
],
"frozen": false,
"icon": "braces",
"metadata": {},
@ -969,7 +1057,9 @@
"method": "parse_data",
"name": "text",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -999,7 +1089,9 @@
"display_name": "Data",
"dynamic": false,
"info": "The data to convert to text.",
"input_types": ["Data"],
"input_types": [
"Data"
],
"list": false,
"name": "data",
"placeholder": "",
@ -1032,7 +1124,9 @@
"display_name": "Template",
"dynamic": false,
"info": "The template to use for formatting the data. It can contain the keys {text}, {data} or any other key in the Data.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1071,7 +1165,9 @@
"display_name": "File",
"id": "File-0oa6O",
"node": {
"base_classes": ["Data"],
"base_classes": [
"Data"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -1079,7 +1175,10 @@
"display_name": "File",
"documentation": "",
"edited": false,
"field_order": ["path", "silent_errors"],
"field_order": [
"path",
"silent_errors"
],
"frozen": false,
"icon": "file-text",
"metadata": {},
@ -1091,7 +1190,9 @@
"method": "load_file",
"name": "data",
"selected": "Data",
"types": ["Data"],
"types": [
"Data"
],
"value": "__UNDEFINED__"
}
],

View file

@ -8,12 +8,16 @@
"dataType": "HierarchicalCrewComponent",
"id": "HierarchicalCrewComponent-Y0Uvf",
"name": "output",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ChatOutput-VzVJK",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -31,12 +35,16 @@
"dataType": "HierarchicalTaskComponent",
"id": "HierarchicalTaskComponent-hE8H5",
"name": "task_output",
"output_types": ["HierarchicalTask"]
"output_types": [
"HierarchicalTask"
]
},
"targetHandle": {
"fieldName": "tasks",
"id": "HierarchicalCrewComponent-Y0Uvf",
"inputTypes": ["HierarchicalTask"],
"inputTypes": [
"HierarchicalTask"
],
"type": "other"
}
},
@ -54,12 +62,16 @@
"dataType": "CrewAIAgentComponent",
"id": "CrewAIAgentComponent-EbpXd",
"name": "output",
"output_types": ["Agent"]
"output_types": [
"Agent"
]
},
"targetHandle": {
"fieldName": "agents",
"id": "HierarchicalCrewComponent-Y0Uvf",
"inputTypes": ["Agent"],
"inputTypes": [
"Agent"
],
"type": "other"
}
},
@ -77,12 +89,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-Yjtpu",
"name": "model_output",
"output_types": ["LanguageModel"]
"output_types": [
"LanguageModel"
]
},
"targetHandle": {
"fieldName": "llm",
"id": "CrewAIAgentComponent-EbpXd",
"inputTypes": ["LanguageModel"],
"inputTypes": [
"LanguageModel"
],
"type": "other"
}
},
@ -100,12 +116,16 @@
"dataType": "CrewAIAgentComponent",
"id": "CrewAIAgentComponent-9D8ao",
"name": "output",
"output_types": ["Agent"]
"output_types": [
"Agent"
]
},
"targetHandle": {
"fieldName": "manager_agent",
"id": "HierarchicalCrewComponent-Y0Uvf",
"inputTypes": ["Agent"],
"inputTypes": [
"Agent"
],
"type": "other"
}
},
@ -123,12 +143,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-HgNnu",
"name": "model_output",
"output_types": ["LanguageModel"]
"output_types": [
"LanguageModel"
]
},
"targetHandle": {
"fieldName": "llm",
"id": "CrewAIAgentComponent-9D8ao",
"inputTypes": ["LanguageModel"],
"inputTypes": [
"LanguageModel"
],
"type": "other"
}
},
@ -146,12 +170,16 @@
"dataType": "Prompt",
"id": "Prompt-eqGhn",
"name": "prompt",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "task_description",
"id": "HierarchicalTaskComponent-hE8H5",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -168,12 +196,17 @@
"dataType": "ChatInput",
"id": "ChatInput-xgRl9",
"name": "message",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "query",
"id": "Prompt-eqGhn",
"inputTypes": ["Message", "Text"],
"inputTypes": [
"Message",
"Text"
],
"type": "str"
}
},
@ -190,12 +223,16 @@
"dataType": "CrewAIAgentComponent",
"id": "CrewAIAgentComponent-UMpxO",
"name": "output",
"output_types": ["Agent"]
"output_types": [
"Agent"
]
},
"targetHandle": {
"fieldName": "agents",
"id": "HierarchicalCrewComponent-Y0Uvf",
"inputTypes": ["Agent"],
"inputTypes": [
"Agent"
],
"type": "other"
}
},
@ -212,12 +249,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-Yjtpu",
"name": "model_output",
"output_types": ["LanguageModel"]
"output_types": [
"LanguageModel"
]
},
"targetHandle": {
"fieldName": "llm",
"id": "CrewAIAgentComponent-UMpxO",
"inputTypes": ["LanguageModel"],
"inputTypes": [
"LanguageModel"
],
"type": "other"
}
},
@ -234,12 +275,16 @@
"dataType": "SearchAPI",
"id": "SearchAPI-Yokat",
"name": "api_build_tool",
"output_types": ["Tool"]
"output_types": [
"Tool"
]
},
"targetHandle": {
"fieldName": "tools",
"id": "CrewAIAgentComponent-EbpXd",
"inputTypes": ["Tool"],
"inputTypes": [
"Tool"
],
"type": "other"
}
},
@ -257,7 +302,9 @@
"display_name": "Hierarchical Crew",
"id": "HierarchicalCrewComponent-Y0Uvf",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -289,7 +336,9 @@
"name": "output",
"required_inputs": [],
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -301,7 +350,9 @@
"display_name": "Agents",
"dynamic": false,
"info": "",
"input_types": ["Agent"],
"input_types": [
"Agent"
],
"list": true,
"name": "agents",
"placeholder": "",
@ -335,7 +386,9 @@
"display_name": "Function Calling LLM",
"dynamic": false,
"info": "Turns the ReAct CrewAI agent into a function-calling agent",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "function_calling_llm",
"placeholder": "",
@ -351,7 +404,9 @@
"display_name": "Manager Agent",
"dynamic": false,
"info": "",
"input_types": ["Agent"],
"input_types": [
"Agent"
],
"list": false,
"name": "manager_agent",
"placeholder": "",
@ -367,7 +422,9 @@
"display_name": "Manager LLM",
"dynamic": false,
"info": "",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "manager_llm",
"placeholder": "",
@ -428,7 +485,9 @@
"display_name": "Tasks",
"dynamic": false,
"info": "",
"input_types": ["HierarchicalTask"],
"input_types": [
"HierarchicalTask"
],
"list": true,
"name": "tasks",
"placeholder": "",
@ -490,7 +549,10 @@
"display_name": "OpenAI",
"id": "OpenAIModel-Yjtpu",
"node": {
"base_classes": ["LanguageModel", "Message"],
"base_classes": [
"LanguageModel",
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -522,9 +584,15 @@
"display_name": "Text",
"method": "text_response",
"name": "text_output",
"required_inputs": ["input_value", "stream", "system_message"],
"required_inputs": [
"input_value",
"stream",
"system_message"
],
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
},
{
@ -544,7 +612,9 @@
"temperature"
],
"selected": "LanguageModel",
"types": ["LanguageModel"],
"types": [
"LanguageModel"
],
"value": "__UNDEFINED__"
}
],
@ -556,7 +626,9 @@
"display_name": "OpenAI API Key",
"dynamic": false,
"info": "The OpenAI API Key to use for the OpenAI model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"load_from_db": true,
"name": "api_key",
"password": true,
@ -590,7 +662,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -694,7 +768,9 @@
"display_name": "Output Parser",
"dynamic": false,
"info": "The parser to use to parse the output of the model",
"input_types": ["OutputParser"],
"input_types": [
"OutputParser"
],
"list": false,
"name": "output_parser",
"placeholder": "",
@ -755,7 +831,9 @@
"display_name": "System Message",
"dynamic": false,
"info": "System message to pass to the model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "system_message",
@ -808,7 +886,9 @@
"display_name": "Chat Output",
"id": "ChatOutput-VzVJK",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -835,7 +915,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -858,14 +940,16 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"data_template": {
"advanced": true,
"display_name": "Data Template",
"dynamic": false,
"info": "Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "data_template",
@ -883,7 +967,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as output.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -902,7 +988,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@ -916,7 +1005,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@ -934,7 +1025,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@ -987,7 +1080,9 @@
"display_name": "Hierarchical Task",
"id": "HierarchicalTaskComponent-hE8H5",
"node": {
"base_classes": ["HierarchicalTask"],
"base_classes": [
"HierarchicalTask"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -995,7 +1090,11 @@
"display_name": "Hierarchical Task",
"documentation": "",
"edited": false,
"field_order": ["task_description", "expected_output", "tools"],
"field_order": [
"task_description",
"expected_output",
"tools"
],
"frozen": false,
"icon": "CrewAI",
"metadata": {},
@ -1007,7 +1106,9 @@
"method": "build_task",
"name": "task_output",
"selected": "HierarchicalTask",
"types": ["HierarchicalTask"],
"types": [
"HierarchicalTask"
],
"value": "__UNDEFINED__"
}
],
@ -1037,7 +1138,9 @@
"display_name": "Expected Output",
"dynamic": false,
"info": "Clear definition of expected task outcome.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1056,7 +1159,9 @@
"display_name": "Description",
"dynamic": false,
"info": "Descriptive text detailing task's purpose and execution.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1075,7 +1180,9 @@
"display_name": "Tools",
"dynamic": false,
"info": "List of tools/resources limited for task execution. Uses the Agent tools by default.",
"input_types": ["Tool"],
"input_types": [
"Tool"
],
"list": true,
"name": "tools",
"placeholder": "",
@ -1111,7 +1218,9 @@
"display_name": "CrewAI Agent",
"id": "CrewAIAgentComponent-EbpXd",
"node": {
"base_classes": ["Agent"],
"base_classes": [
"Agent"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -1142,7 +1251,9 @@
"method": "build_output",
"name": "output",
"selected": "Agent",
"types": ["Agent"],
"types": [
"Agent"
],
"value": "__UNDEFINED__"
}
],
@ -1184,7 +1295,9 @@
"display_name": "Backstory",
"dynamic": false,
"info": "The backstory of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1221,7 +1334,9 @@
"display_name": "Goal",
"dynamic": false,
"info": "The objective of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1255,7 +1370,9 @@
"display_name": "Language Model",
"dynamic": false,
"info": "Language model that will run the agent.",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "llm",
"placeholder": "",
@ -1286,7 +1403,9 @@
"display_name": "Role",
"dynamic": false,
"info": "The role of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1305,7 +1424,9 @@
"display_name": "Tools",
"dynamic": false,
"info": "Tools at agents disposal",
"input_types": ["Tool"],
"input_types": [
"Tool"
],
"list": true,
"name": "tools",
"placeholder": "",
@ -1356,7 +1477,9 @@
"display_name": "CrewAI Agent",
"id": "CrewAIAgentComponent-9D8ao",
"node": {
"base_classes": ["Agent"],
"base_classes": [
"Agent"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -1387,7 +1510,9 @@
"method": "build_output",
"name": "output",
"selected": "Agent",
"types": ["Agent"],
"types": [
"Agent"
],
"value": "__UNDEFINED__"
}
],
@ -1429,7 +1554,9 @@
"display_name": "Backstory",
"dynamic": false,
"info": "The backstory of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1466,7 +1593,9 @@
"display_name": "Goal",
"dynamic": false,
"info": "The objective of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1500,7 +1629,9 @@
"display_name": "Language Model",
"dynamic": false,
"info": "Language model that will run the agent.",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "llm",
"placeholder": "",
@ -1531,7 +1662,9 @@
"display_name": "Role",
"dynamic": false,
"info": "The role of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1550,7 +1683,9 @@
"display_name": "Tools",
"dynamic": false,
"info": "Tools at agents disposal",
"input_types": ["Tool"],
"input_types": [
"Tool"
],
"list": true,
"name": "tools",
"placeholder": "",
@ -1601,7 +1736,10 @@
"display_name": "OpenAI",
"id": "OpenAIModel-HgNnu",
"node": {
"base_classes": ["LanguageModel", "Message"],
"base_classes": [
"LanguageModel",
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -1633,9 +1771,15 @@
"display_name": "Text",
"method": "text_response",
"name": "text_output",
"required_inputs": ["input_value", "stream", "system_message"],
"required_inputs": [
"input_value",
"stream",
"system_message"
],
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
},
{
@ -1655,7 +1799,9 @@
"temperature"
],
"selected": "LanguageModel",
"types": ["LanguageModel"],
"types": [
"LanguageModel"
],
"value": "__UNDEFINED__"
}
],
@ -1667,7 +1813,9 @@
"display_name": "OpenAI API Key",
"dynamic": false,
"info": "The OpenAI API Key to use for the OpenAI model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"load_from_db": true,
"name": "api_key",
"password": true,
@ -1701,7 +1849,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -1805,7 +1955,9 @@
"display_name": "Output Parser",
"dynamic": false,
"info": "The parser to use to parse the output of the model",
"input_types": ["OutputParser"],
"input_types": [
"OutputParser"
],
"list": false,
"name": "output_parser",
"placeholder": "",
@ -1866,7 +2018,9 @@
"display_name": "System Message",
"dynamic": false,
"info": "System message to pass to the model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "system_message",
@ -1919,18 +2073,24 @@
"display_name": "Prompt",
"id": "Prompt-eqGhn",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {
"template": ["query"]
"template": [
"query"
]
},
"description": "Create a prompt template with dynamic variables.",
"display_name": "Prompt",
"documentation": "",
"edited": false,
"error": null,
"field_order": ["template"],
"field_order": [
"template"
],
"frozen": false,
"full_path": null,
"icon": "prompts",
@ -1947,7 +2107,9 @@
"method": "build_prompt",
"name": "prompt",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -1980,7 +2142,10 @@
"fileTypes": [],
"file_path": "",
"info": "",
"input_types": ["Message", "Text"],
"input_types": [
"Message",
"Text"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2031,7 +2196,9 @@
"data": {
"id": "ChatInput-xgRl9",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -2058,7 +2225,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -2081,7 +2250,7 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"files": {
"advanced": true,
@ -2129,7 +2298,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as input.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2149,7 +2320,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@ -2163,7 +2337,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@ -2181,7 +2357,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@ -2234,7 +2412,9 @@
"display_name": "CrewAI Agent",
"id": "CrewAIAgentComponent-UMpxO",
"node": {
"base_classes": ["Agent"],
"base_classes": [
"Agent"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -2265,7 +2445,9 @@
"method": "build_output",
"name": "output",
"selected": "Agent",
"types": ["Agent"],
"types": [
"Agent"
],
"value": "__UNDEFINED__"
}
],
@ -2307,7 +2489,9 @@
"display_name": "Backstory",
"dynamic": false,
"info": "The backstory of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2344,7 +2528,9 @@
"display_name": "Goal",
"dynamic": false,
"info": "The objective of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2378,7 +2564,9 @@
"display_name": "Language Model",
"dynamic": false,
"info": "Language model that will run the agent.",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "llm",
"placeholder": "",
@ -2409,7 +2597,9 @@
"display_name": "Role",
"dynamic": false,
"info": "The role of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2428,7 +2618,9 @@
"display_name": "Tools",
"dynamic": false,
"info": "Tools at agents disposal",
"input_types": ["Tool"],
"input_types": [
"Tool"
],
"list": true,
"name": "tools",
"placeholder": "",
@ -2477,7 +2669,10 @@
"data": {
"id": "SearchAPI-Yokat",
"node": {
"base_classes": ["Data", "Tool"],
"base_classes": [
"Data",
"Tool"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -2509,7 +2704,9 @@
"search_params"
],
"selected": "Data",
"types": ["Data"],
"types": [
"Data"
],
"value": "__UNDEFINED__"
},
{
@ -2526,7 +2723,9 @@
"search_params"
],
"selected": "Tool",
"types": ["Tool"],
"types": [
"Tool"
],
"value": "__UNDEFINED__"
}
],
@ -2538,7 +2737,9 @@
"display_name": "SearchAPI API Key",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"load_from_db": true,
"name": "api_key",
"password": true,
@ -2572,7 +2773,9 @@
"display_name": "Engine",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "engine",
@ -2590,7 +2793,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2689,4 +2894,5 @@
"openai",
"chatbots"
]
}


@ -8,12 +8,17 @@
"dataType": "ChatInput",
"id": "ChatInput-6yuNd",
"name": "message",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "user_message",
"id": "Prompt-tifRl",
"inputTypes": ["Message", "Text"],
"inputTypes": [
"Message",
"Text"
],
"type": "str"
}
},
@ -30,12 +35,16 @@
"dataType": "Prompt",
"id": "Prompt-tifRl",
"name": "prompt",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "OpenAIModel-ZIeE0",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -52,12 +61,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-ZIeE0",
"name": "text_output",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ChatOutput-c3v9q",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -74,12 +87,17 @@
"dataType": "Memory",
"id": "Memory-6s5g1",
"name": "messages_text",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "context",
"id": "Prompt-tifRl",
"inputTypes": ["Message", "Text"],
"inputTypes": [
"Message",
"Text"
],
"type": "str"
}
},
@ -97,17 +115,24 @@
"display_name": "Prompt",
"id": "Prompt-tifRl",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {
"template": ["context", "user_message"]
"template": [
"context",
"user_message"
]
},
"description": "Create a prompt template with dynamic variables.",
"display_name": "Prompt",
"documentation": "",
"edited": false,
"field_order": ["template"],
"field_order": [
"template"
],
"frozen": false,
"icon": "prompts",
"metadata": {},
@ -119,7 +144,9 @@
"method": "build_prompt",
"name": "prompt",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -152,7 +179,10 @@
"fileTypes": [],
"file_path": "",
"info": "",
"input_types": ["Message", "Text"],
"input_types": [
"Message",
"Text"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -189,7 +219,10 @@
"fileTypes": [],
"file_path": "",
"info": "",
"input_types": ["Message", "Text"],
"input_types": [
"Message",
"Text"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -227,7 +260,9 @@
"display_name": "Chat Input",
"id": "ChatInput-6yuNd",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -254,7 +289,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -277,7 +314,7 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"files": {
"advanced": true,
@ -325,7 +362,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as input.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -345,7 +384,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@ -359,7 +401,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@ -377,7 +421,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@ -431,7 +477,10 @@
"display_name": "OpenAI",
"id": "OpenAIModel-ZIeE0",
"node": {
"base_classes": ["LanguageModel", "Message"],
"base_classes": [
"LanguageModel",
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -463,9 +512,15 @@
"display_name": "Text",
"method": "text_response",
"name": "text_output",
"required_inputs": ["input_value", "stream", "system_message"],
"required_inputs": [
"input_value",
"stream",
"system_message"
],
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
},
{
@ -485,7 +540,9 @@
"temperature"
],
"selected": "LanguageModel",
"types": ["LanguageModel"],
"types": [
"LanguageModel"
],
"value": "__UNDEFINED__"
}
],
@ -498,7 +555,9 @@
"display_name": "OpenAI API Key",
"dynamic": false,
"info": "The OpenAI API Key to use for the OpenAI model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"load_from_db": true,
"name": "api_key",
"password": true,
@ -532,7 +591,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -635,7 +696,9 @@
"display_name": "Output Parser",
"dynamic": false,
"info": "The parser to use to parse the output of the model",
"input_types": ["OutputParser"],
"input_types": [
"OutputParser"
],
"list": false,
"name": "output_parser",
"placeholder": "",
@ -747,7 +810,9 @@
"display_name": "Chat Output",
"id": "ChatOutput-c3v9q",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -774,7 +839,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -797,14 +864,16 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"data_template": {
"advanced": true,
"display_name": "Data Template",
"dynamic": false,
"info": "Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "data_template",
@ -822,7 +891,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as output.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -841,7 +912,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@ -855,7 +929,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@ -873,7 +949,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@ -922,7 +1000,11 @@
"display_name": "Chat Memory",
"id": "Memory-6s5g1",
"node": {
"base_classes": ["BaseChatMemory", "Data", "Message"],
"base_classes": [
"BaseChatMemory",
"Data",
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -950,7 +1032,9 @@
"method": "retrieve_messages",
"name": "messages",
"selected": "Data",
"types": ["Data"],
"types": [
"Data"
],
"value": "__UNDEFINED__"
},
{
@ -959,7 +1043,9 @@
"method": "retrieve_messages_as_text",
"name": "messages_text",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
},
{
@ -968,7 +1054,9 @@
"method": "build_lc_memory",
"name": "lc_memory",
"selected": "BaseChatMemory",
"types": ["BaseChatMemory"],
"types": [
"BaseChatMemory"
],
"value": "__UNDEFINED__"
}
],
@ -998,7 +1086,9 @@
"display_name": "External Memory",
"dynamic": false,
"info": "Retrieve messages from an external memory. If empty, it will use the Langflow tables.",
"input_types": ["BaseChatMessageHistory"],
"input_types": [
"BaseChatMessageHistory"
],
"list": false,
"name": "memory",
"placeholder": "",
@ -1030,7 +1120,10 @@
"dynamic": false,
"info": "Order of the messages.",
"name": "order",
"options": ["Ascending", "Descending"],
"options": [
"Ascending",
"Descending"
],
"placeholder": "",
"required": false,
"show": true,
@ -1045,7 +1138,11 @@
"dynamic": false,
"info": "Filter by sender type.",
"name": "sender",
"options": ["Machine", "User", "Machine and User"],
"options": [
"Machine",
"User",
"Machine and User"
],
"placeholder": "",
"required": false,
"show": true,
@ -1059,7 +1156,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Filter by sender name.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@ -1077,7 +1176,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@ -1095,7 +1196,9 @@
"display_name": "Template",
"dynamic": false,
"info": "The template to use for formatting the data. It can contain the keys {text}, {sender} or any other key in the message data.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,


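The substantive change in the ChatOutput `"value"` strings above is that message persistence now goes through the component's own `store_message` method (instead of the module-level `store_message(message, flow_id=...)`), and the stored copy replaces the local message before being returned. A minimal sketch of that pattern, using simplified stand-in classes — `Message`, `ChatComponent`, and the in-memory `_db` list here are illustrative, not the real langflow implementations:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Message:
    """Simplified stand-in for langflow's Message schema."""
    text: str
    sender: str = "Machine"
    session_id: str = ""
    id: Optional[str] = None  # assigned when the message is persisted


class ChatComponent:
    """Stand-in base: store_message persists and returns the stored copy."""
    _db: list = []  # pretend database table

    def store_message(self, message: Message) -> Message:
        message.id = f"msg-{len(self._db)}"  # pretend the DB assigns an ID
        self._db.append(message)
        return message


class ChatOutput(ChatComponent):
    def message_response(self, text: str, session_id: str, should_store: bool = True) -> Message:
        message = Message(text=text, session_id=session_id)
        if session_id and should_store:
            # New pattern from the diff: replace the local message with the
            # stored one, so downstream consumers see DB-assigned fields.
            message = self.store_message(message)
        return message


out = ChatOutput().message_response("hello", session_id="s1")
print(out.id)  # the returned message carries the stored ID
```

Returning the stored copy (rather than the freshly built one) is what lets the playground later fetch the same row back from the messages table by ID.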
@ -8,12 +8,16 @@
"dataType": "SequentialCrewComponent",
"id": "SequentialCrewComponent-3dbbB",
"name": "output",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ChatOutput-nwCjg",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -30,12 +34,17 @@
"dataType": "TextInput",
"id": "TextInput-6QUGr",
"name": "text",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "topic",
"id": "Prompt-GOdlL",
"inputTypes": ["Message", "Text"],
"inputTypes": [
"Message",
"Text"
],
"type": "str"
}
},
@ -52,12 +61,17 @@
"dataType": "TextInput",
"id": "TextInput-6QUGr",
"name": "text",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "topic",
"id": "Prompt-824D7",
"inputTypes": ["Message", "Text"],
"inputTypes": [
"Message",
"Text"
],
"type": "str"
}
},
@ -74,12 +88,17 @@
"dataType": "TextInput",
"id": "TextInput-6QUGr",
"name": "text",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "topic",
"id": "Prompt-0vHob",
"inputTypes": ["Message", "Text"],
"inputTypes": [
"Message",
"Text"
],
"type": "str"
}
},
@ -96,12 +115,16 @@
"dataType": "Prompt",
"id": "Prompt-GOdlL",
"name": "prompt",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "task_description",
"id": "SequentialTaskAgentComponent-GWMA1",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -118,12 +141,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-lQ5HF",
"name": "model_output",
"output_types": ["LanguageModel"]
"output_types": [
"LanguageModel"
]
},
"targetHandle": {
"fieldName": "llm",
"id": "SequentialTaskAgentComponent-GWMA1",
"inputTypes": ["LanguageModel"],
"inputTypes": [
"LanguageModel"
],
"type": "other"
}
},
@ -139,12 +166,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-lQ5HF",
"name": "model_output",
"output_types": ["LanguageModel"]
"output_types": [
"LanguageModel"
]
},
"targetHandle": {
"fieldName": "llm",
"id": "SequentialTaskAgentComponent-5i4Wg",
"inputTypes": ["LanguageModel"],
"inputTypes": [
"LanguageModel"
],
"type": "other"
}
},
@ -160,12 +191,16 @@
"dataType": "Prompt",
"id": "Prompt-824D7",
"name": "prompt",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "task_description",
"id": "SequentialTaskAgentComponent-5i4Wg",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -181,12 +216,16 @@
"dataType": "SequentialTaskAgentComponent",
"id": "SequentialTaskAgentComponent-GWMA1",
"name": "task_output",
"output_types": ["SequentialTask"]
"output_types": [
"SequentialTask"
]
},
"targetHandle": {
"fieldName": "previous_task",
"id": "SequentialTaskAgentComponent-5i4Wg",
"inputTypes": ["SequentialTask"],
"inputTypes": [
"SequentialTask"
],
"type": "other"
}
},
@ -202,12 +241,16 @@
"dataType": "SequentialTaskAgentComponent",
"id": "SequentialTaskAgentComponent-5i4Wg",
"name": "task_output",
"output_types": ["SequentialTask"]
"output_types": [
"SequentialTask"
]
},
"targetHandle": {
"fieldName": "previous_task",
"id": "SequentialTaskAgentComponent-TPEWE",
"inputTypes": ["SequentialTask"],
"inputTypes": [
"SequentialTask"
],
"type": "other"
}
},
@ -223,12 +266,16 @@
"dataType": "Prompt",
"id": "Prompt-0vHob",
"name": "prompt",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "task_description",
"id": "SequentialTaskAgentComponent-TPEWE",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -244,12 +291,16 @@
"dataType": "SequentialTaskAgentComponent",
"id": "SequentialTaskAgentComponent-TPEWE",
"name": "task_output",
"output_types": ["SequentialTask"]
"output_types": [
"SequentialTask"
]
},
"targetHandle": {
"fieldName": "tasks",
"id": "SequentialCrewComponent-3dbbB",
"inputTypes": ["SequentialTask"],
"inputTypes": [
"SequentialTask"
],
"type": "other"
}
},
@ -265,12 +316,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-lQ5HF",
"name": "model_output",
"output_types": ["LanguageModel"]
"output_types": [
"LanguageModel"
]
},
"targetHandle": {
"fieldName": "llm",
"id": "SequentialTaskAgentComponent-TPEWE",
"inputTypes": ["LanguageModel"],
"inputTypes": [
"LanguageModel"
],
"type": "other"
}
},
@ -286,12 +341,16 @@
"dataType": "YFinanceTool",
"id": "YFinanceTool-Asoka",
"name": "tool",
"output_types": ["Tool"]
"output_types": [
"Tool"
]
},
"targetHandle": {
"fieldName": "tools",
"id": "SequentialTaskAgentComponent-GWMA1",
"inputTypes": ["Tool"],
"inputTypes": [
"Tool"
],
"type": "other"
}
},
@ -309,7 +368,9 @@
"display_name": "Sequential Crew",
"id": "SequentialCrewComponent-3dbbB",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -339,7 +400,9 @@
"name": "output",
"required_inputs": [],
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -369,7 +432,9 @@
"display_name": "Function Calling LLM",
"dynamic": false,
"info": "Turns the ReAct CrewAI agent into a function-calling agent",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "function_calling_llm",
"placeholder": "",
@ -430,7 +495,9 @@
"display_name": "Tasks",
"dynamic": false,
"info": "",
"input_types": ["SequentialTask"],
"input_types": [
"SequentialTask"
],
"list": true,
"name": "tasks",
"placeholder": "",
@ -494,7 +561,10 @@
"data": {
"id": "OpenAIModel-lQ5HF",
"node": {
"base_classes": ["LanguageModel", "Message"],
"base_classes": [
"LanguageModel",
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -527,9 +597,15 @@
"display_name": "Text",
"method": "text_response",
"name": "text_output",
"required_inputs": ["input_value", "stream", "system_message"],
"required_inputs": [
"input_value",
"stream",
"system_message"
],
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
},
{
@ -549,7 +625,9 @@
"temperature"
],
"selected": "LanguageModel",
"types": ["LanguageModel"],
"types": [
"LanguageModel"
],
"value": "__UNDEFINED__"
}
],
@ -562,7 +640,9 @@
"display_name": "OpenAI API Key",
"dynamic": false,
"info": "The OpenAI API Key to use for the OpenAI model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"load_from_db": true,
"name": "api_key",
"password": true,
@ -596,7 +676,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -699,7 +781,9 @@
"display_name": "Output Parser",
"dynamic": false,
"info": "The parser to use to parse the output of the model",
"input_types": ["OutputParser"],
"input_types": [
"OutputParser"
],
"list": false,
"name": "output_parser",
"placeholder": "",
@ -811,7 +895,9 @@
"display_name": "Chat Output",
"id": "ChatOutput-nwCjg",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -839,7 +925,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -862,14 +950,16 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"data_template": {
"advanced": true,
"display_name": "Data Template",
"dynamic": false,
"info": "Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "data_template",
@ -887,7 +977,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as output.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -906,7 +998,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@ -920,7 +1015,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@ -938,7 +1035,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@ -985,7 +1084,9 @@
"data": {
"id": "TextInput-6QUGr",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -993,7 +1094,9 @@
"display_name": "Topic",
"documentation": "",
"edited": false,
"field_order": ["input_value"],
"field_order": [
"input_value"
],
"frozen": false,
"icon": "type",
"lf_version": "1.0.15",
@ -1005,7 +1108,9 @@
"method": "text_response",
"name": "text",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -1035,7 +1140,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Text to be passed as input.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -1073,17 +1180,23 @@
"display_name": "Prompt",
"id": "Prompt-GOdlL",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {
"template": ["topic"]
"template": [
"topic"
]
},
"description": "Create a prompt template with dynamic variables.",
"display_name": "Prompt",
"documentation": "",
"edited": false,
"field_order": ["template"],
"field_order": [
"template"
],
"frozen": false,
"icon": "prompts",
"lf_version": "1.0.15",
@ -1096,7 +1209,9 @@
"method": "build_prompt",
"name": "prompt",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -1145,7 +1260,10 @@
"fileTypes": [],
"file_path": "",
"info": "",
"input_types": ["Message", "Text"],
"input_types": [
"Message",
"Text"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1183,17 +1301,23 @@
"display_name": "Prompt",
"id": "Prompt-824D7",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {
"template": ["topic"]
"template": [
"topic"
]
},
"description": "Create a prompt template with dynamic variables.",
"display_name": "Prompt",
"documentation": "",
"edited": false,
"field_order": ["template"],
"field_order": [
"template"
],
"frozen": false,
"icon": "prompts",
"lf_version": "1.0.15",
@ -1206,7 +1330,9 @@
"method": "build_prompt",
"name": "prompt",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -1255,7 +1381,10 @@
"fileTypes": [],
"file_path": "",
"info": "",
"input_types": ["Message", "Text"],
"input_types": [
"Message",
"Text"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1293,17 +1422,23 @@
"display_name": "Prompt",
"id": "Prompt-0vHob",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {
"template": ["topic"]
"template": [
"topic"
]
},
"description": "Create a prompt template with dynamic variables.",
"display_name": "Prompt",
"documentation": "",
"edited": false,
"field_order": ["template"],
"field_order": [
"template"
],
"frozen": false,
"icon": "prompts",
"lf_version": "1.0.15",
@ -1316,7 +1451,9 @@
"method": "build_prompt",
"name": "prompt",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -1365,7 +1502,10 @@
"fileTypes": [],
"file_path": "",
"info": "",
"input_types": ["Message", "Text"],
"input_types": [
"Message",
"Text"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1401,7 +1541,9 @@
"data": {
"id": "SequentialTaskAgentComponent-GWMA1",
"node": {
"base_classes": ["SequentialTask"],
"base_classes": [
"SequentialTask"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -1437,7 +1579,9 @@
"method": "build_agent_and_task",
"name": "task_output",
"selected": "SequentialTask",
"types": ["SequentialTask"],
"types": [
"SequentialTask"
],
"value": "__UNDEFINED__"
}
],
@ -1514,7 +1658,9 @@
"display_name": "Backstory",
"dynamic": false,
"info": "The backstory of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1552,7 +1698,9 @@
"display_name": "Expected Task Output",
"dynamic": false,
"info": "Clear definition of expected task outcome.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1572,7 +1720,9 @@
"display_name": "Goal",
"dynamic": false,
"info": "The objective of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1592,7 +1742,9 @@
"display_name": "Language Model",
"dynamic": false,
"info": "Language model that will run the agent.",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "llm",
"placeholder": "",
@ -1625,7 +1777,9 @@
"display_name": "Previous Task",
"dynamic": false,
"info": "The previous task in the sequence (for chaining).",
"input_types": ["SequentialTask"],
"input_types": [
"SequentialTask"
],
"list": false,
"name": "previous_task",
"placeholder": "",
@ -1642,7 +1796,9 @@
"display_name": "Role",
"dynamic": false,
"info": "The role of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1662,7 +1818,9 @@
"display_name": "Task Description",
"dynamic": false,
"info": "Descriptive text detailing task's purpose and execution.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1682,7 +1840,9 @@
"display_name": "Tools",
"dynamic": false,
"info": "Tools at agent's disposal",
"input_types": ["Tool"],
"input_types": [
"Tool"
],
"list": true,
"name": "tools",
"placeholder": "",
@ -1732,7 +1892,9 @@
"data": {
"id": "SequentialTaskAgentComponent-5i4Wg",
"node": {
"base_classes": ["SequentialTask"],
"base_classes": [
"SequentialTask"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -1768,7 +1930,9 @@
"method": "build_agent_and_task",
"name": "task_output",
"selected": "SequentialTask",
"types": ["SequentialTask"],
"types": [
"SequentialTask"
],
"value": "__UNDEFINED__"
}
],
@ -1845,7 +2009,9 @@
"display_name": "Backstory",
"dynamic": false,
"info": "The backstory of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1883,7 +2049,9 @@
"display_name": "Expected Task Output",
"dynamic": false,
"info": "Clear definition of expected task outcome.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1903,7 +2071,9 @@
"display_name": "Goal",
"dynamic": false,
"info": "The objective of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1923,7 +2093,9 @@
"display_name": "Language Model",
"dynamic": false,
"info": "Language model that will run the agent.",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "llm",
"placeholder": "",
@ -1956,7 +2128,9 @@
"display_name": "Previous Task",
"dynamic": false,
"info": "The previous task in the sequence (for chaining).",
"input_types": ["SequentialTask"],
"input_types": [
"SequentialTask"
],
"list": false,
"name": "previous_task",
"placeholder": "",
@ -1973,7 +2147,9 @@
"display_name": "Role",
"dynamic": false,
"info": "The role of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1993,7 +2169,9 @@
"display_name": "Task Description",
"dynamic": false,
"info": "Descriptive text detailing task's purpose and execution.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2013,7 +2191,9 @@
"display_name": "Tools",
"dynamic": false,
"info": "Tools at agent's disposal",
"input_types": ["Tool"],
"input_types": [
"Tool"
],
"list": true,
"name": "tools",
"placeholder": "",
@ -2063,7 +2243,9 @@
"data": {
"id": "SequentialTaskAgentComponent-TPEWE",
"node": {
"base_classes": ["SequentialTask"],
"base_classes": [
"SequentialTask"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -2099,7 +2281,9 @@
"method": "build_agent_and_task",
"name": "task_output",
"selected": "SequentialTask",
"types": ["SequentialTask"],
"types": [
"SequentialTask"
],
"value": "__UNDEFINED__"
}
],
@ -2176,7 +2360,9 @@
"display_name": "Backstory",
"dynamic": false,
"info": "The backstory of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2214,7 +2400,9 @@
"display_name": "Expected Task Output",
"dynamic": false,
"info": "Clear definition of expected task outcome.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2234,7 +2422,9 @@
"display_name": "Goal",
"dynamic": false,
"info": "The objective of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2254,7 +2444,9 @@
"display_name": "Language Model",
"dynamic": false,
"info": "Language model that will run the agent.",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "llm",
"placeholder": "",
@ -2287,7 +2479,9 @@
"display_name": "Previous Task",
"dynamic": false,
"info": "The previous task in the sequence (for chaining).",
"input_types": ["SequentialTask"],
"input_types": [
"SequentialTask"
],
"list": false,
"name": "previous_task",
"placeholder": "",
@ -2304,7 +2498,9 @@
"display_name": "Role",
"dynamic": false,
"info": "The role of the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2324,7 +2520,9 @@
"display_name": "Task Description",
"dynamic": false,
"info": "Descriptive text detailing task's purpose and execution.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2344,7 +2542,9 @@
"display_name": "Tools",
"dynamic": false,
"info": "Tools at agent's disposal",
"input_types": ["Tool"],
"input_types": [
"Tool"
],
"list": true,
"name": "tools",
"placeholder": "",
@ -2394,7 +2594,9 @@
"data": {
"id": "YFinanceTool-Asoka",
"node": {
"base_classes": ["Tool"],
"base_classes": [
"Tool"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -2414,7 +2616,9 @@
"method": "run_model",
"name": "api_run_model",
"selected": "Data",
"types": ["Data"],
"types": [
"Data"
],
"value": "__UNDEFINED__"
},
{
@ -2423,7 +2627,9 @@
"method": "build_tool",
"name": "tool",
"selected": "Tool",
"types": ["Tool"],
"types": [
"Tool"
],
"value": "__UNDEFINED__"
}
],
@ -2454,7 +2660,9 @@
"display_name": "Query",
"dynamic": false,
"info": "Input should be a company ticker. For example, AAPL for Apple, MSFT for Microsoft.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",


@ -8,12 +8,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-gRakF",
"name": "model_output",
"output_types": ["LanguageModel"]
"output_types": [
"LanguageModel"
]
},
"targetHandle": {
"fieldName": "llm",
"id": "ToolCallingAgent-0QzrL",
"inputTypes": ["LanguageModel"],
"inputTypes": [
"LanguageModel"
],
"type": "other"
}
},
@ -30,12 +34,16 @@
"dataType": "ChatInput",
"id": "ChatInput-uYdzQ",
"name": "message",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ToolCallingAgent-0QzrL",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -52,12 +60,16 @@
"dataType": "ToolCallingAgent",
"id": "ToolCallingAgent-KLe5u",
"name": "response",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ToolCallingAgent-VYDK9",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -74,12 +86,16 @@
"dataType": "ToolCallingAgent",
"id": "ToolCallingAgent-0QzrL",
"name": "response",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ToolCallingAgent-KLe5u",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -96,12 +112,17 @@
"dataType": "SearchAPI",
"id": "SearchAPI-I4yU0",
"name": "api_build_tool",
"output_types": ["Tool"]
"output_types": [
"Tool"
]
},
"targetHandle": {
"fieldName": "tools",
"id": "ToolCallingAgent-0QzrL",
"inputTypes": ["Tool", "BaseTool"],
"inputTypes": [
"Tool",
"BaseTool"
],
"type": "other"
}
},
@ -118,12 +139,17 @@
"dataType": "url_content_fetcher",
"id": "url_content_fetcher-1FugB",
"name": "api_build_tool",
"output_types": ["Tool"]
"output_types": [
"Tool"
]
},
"targetHandle": {
"fieldName": "tools",
"id": "ToolCallingAgent-0QzrL",
"inputTypes": ["Tool", "BaseTool"],
"inputTypes": [
"Tool",
"BaseTool"
],
"type": "other"
}
},
@ -140,12 +166,17 @@
"dataType": "SearchAPI",
"id": "SearchAPI-I4yU0",
"name": "api_build_tool",
"output_types": ["Tool"]
"output_types": [
"Tool"
]
},
"targetHandle": {
"fieldName": "tools",
"id": "ToolCallingAgent-KLe5u",
"inputTypes": ["Tool", "BaseTool"],
"inputTypes": [
"Tool",
"BaseTool"
],
"type": "other"
}
},
@ -162,12 +193,17 @@
"dataType": "url_content_fetcher",
"id": "url_content_fetcher-1FugB",
"name": "api_build_tool",
"output_types": ["Tool"]
"output_types": [
"Tool"
]
},
"targetHandle": {
"fieldName": "tools",
"id": "ToolCallingAgent-KLe5u",
"inputTypes": ["Tool", "BaseTool"],
"inputTypes": [
"Tool",
"BaseTool"
],
"type": "other"
}
},
@ -184,12 +220,17 @@
"dataType": "url_content_fetcher",
"id": "url_content_fetcher-1FugB",
"name": "api_build_tool",
"output_types": ["Tool"]
"output_types": [
"Tool"
]
},
"targetHandle": {
"fieldName": "tools",
"id": "ToolCallingAgent-VYDK9",
"inputTypes": ["Tool", "BaseTool"],
"inputTypes": [
"Tool",
"BaseTool"
],
"type": "other"
}
},
@ -206,12 +247,17 @@
"dataType": "SearchAPI",
"id": "SearchAPI-I4yU0",
"name": "api_build_tool",
"output_types": ["Tool"]
"output_types": [
"Tool"
]
},
"targetHandle": {
"fieldName": "tools",
"id": "ToolCallingAgent-VYDK9",
"inputTypes": ["Tool", "BaseTool"],
"inputTypes": [
"Tool",
"BaseTool"
],
"type": "other"
}
},
@ -228,12 +274,16 @@
"dataType": "ToolCallingAgent",
"id": "ToolCallingAgent-VYDK9",
"name": "response",
"output_types": ["Message"]
"output_types": [
"Message"
]
},
"targetHandle": {
"fieldName": "input_value",
"id": "ChatOutput-O63dG",
"inputTypes": ["Message"],
"inputTypes": [
"Message"
],
"type": "str"
}
},
@ -250,12 +300,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-gRakF",
"name": "model_output",
"output_types": ["LanguageModel"]
"output_types": [
"LanguageModel"
]
},
"targetHandle": {
"fieldName": "llm",
"id": "ToolCallingAgent-VYDK9",
"inputTypes": ["LanguageModel"],
"inputTypes": [
"LanguageModel"
],
"type": "other"
}
},
@ -272,12 +326,16 @@
"dataType": "OpenAIModel",
"id": "OpenAIModel-gRakF",
"name": "model_output",
"output_types": ["LanguageModel"]
"output_types": [
"LanguageModel"
]
},
"targetHandle": {
"fieldName": "llm",
"id": "ToolCallingAgent-KLe5u",
"inputTypes": ["LanguageModel"],
"inputTypes": [
"LanguageModel"
],
"type": "other"
}
},
@ -294,12 +352,17 @@
"dataType": "CalculatorTool",
"id": "CalculatorTool-5S6u9",
"name": "api_build_tool",
"output_types": ["Tool"]
"output_types": [
"Tool"
]
},
"targetHandle": {
"fieldName": "tools",
"id": "ToolCallingAgent-VYDK9",
"inputTypes": ["Tool", "BaseTool"],
"inputTypes": [
"Tool",
"BaseTool"
],
"type": "other"
}
},
@ -315,7 +378,9 @@
"data": {
"id": "ChatInput-uYdzQ",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -343,7 +408,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -366,7 +433,7 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.data.utils import IMG_FILE_TYPES, TEXT_FILE_TYPES\nfrom langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER\n\n\nclass ChatInput(ChatComponent):\n display_name = \"Chat Input\"\n description = \"Get chat inputs from the Playground.\"\n icon = \"ChatInput\"\n name = \"ChatInput\"\n\n inputs = [\n MultilineInput(\n name=\"input_value\",\n display_name=\"Text\",\n value=\"\",\n info=\"Message to be passed as input.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_USER,\n info=\"Type of sender.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_USER,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. 
If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n FileInput(\n name=\"files\",\n display_name=\"Files\",\n file_types=TEXT_FILE_TYPES + IMG_FILE_TYPES,\n info=\"Files to be sent with the message.\",\n advanced=True,\n is_list=True,\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n files=self.files,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"files": {
"_input_type": "FileInput",
@ -416,7 +483,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as input.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -438,7 +507,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@ -453,7 +525,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@ -472,7 +546,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@ -524,7 +600,9 @@
"data": {
"id": "ChatOutput-O63dG",
"node": {
"base_classes": ["Message"],
"base_classes": [
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -552,7 +630,9 @@
"method": "message_response",
"name": "message",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -575,7 +655,7 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.memory import store_message\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if (\n self.session_id\n and isinstance(message, Message)\n and isinstance(message.text, str)\n and self.should_store_message\n ):\n store_message(\n message,\n flow_id=self.graph.flow_id,\n )\n self.message.value = message\n\n self.status = message\n return message\n"
"value": "from langflow.base.io.chat import ChatComponent\nfrom langflow.inputs import BoolInput\nfrom langflow.io import DropdownInput, MessageTextInput, Output\nfrom langflow.schema.message import Message\nfrom langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER\n\n\nclass ChatOutput(ChatComponent):\n display_name = \"Chat Output\"\n description = \"Display a chat message in the Playground.\"\n icon = \"ChatOutput\"\n name = \"ChatOutput\"\n\n inputs = [\n MessageTextInput(\n name=\"input_value\",\n display_name=\"Text\",\n info=\"Message to be passed as output.\",\n ),\n BoolInput(\n name=\"should_store_message\",\n display_name=\"Store Messages\",\n info=\"Store the message in the history.\",\n value=True,\n advanced=True,\n ),\n DropdownInput(\n name=\"sender\",\n display_name=\"Sender Type\",\n options=[MESSAGE_SENDER_AI, MESSAGE_SENDER_USER],\n value=MESSAGE_SENDER_AI,\n advanced=True,\n info=\"Type of sender.\",\n ),\n MessageTextInput(\n name=\"sender_name\",\n display_name=\"Sender Name\",\n info=\"Name of the sender.\",\n value=MESSAGE_SENDER_NAME_AI,\n advanced=True,\n ),\n MessageTextInput(\n name=\"session_id\",\n display_name=\"Session ID\",\n info=\"The session ID of the chat. If empty, the current session ID parameter will be used.\",\n advanced=True,\n ),\n MessageTextInput(\n name=\"data_template\",\n display_name=\"Data Template\",\n value=\"{text}\",\n advanced=True,\n info=\"Template to convert Data to Text. 
If left empty, it will be dynamically set to the Data's text key.\",\n ),\n ]\n outputs = [\n Output(display_name=\"Message\", name=\"message\", method=\"message_response\"),\n ]\n\n def message_response(self) -> Message:\n message = Message(\n text=self.input_value,\n sender=self.sender,\n sender_name=self.sender_name,\n session_id=self.session_id,\n )\n if self.session_id and isinstance(message, Message) and self.should_store_message:\n stored_message = self.store_message(\n message,\n )\n self.message.value = stored_message\n message = stored_message\n\n self.status = message\n return message\n"
},
"data_template": {
"_input_type": "MessageTextInput",
@ -583,7 +663,9 @@
"display_name": "Data Template",
"dynamic": false,
"info": "Template to convert Data to Text. If left empty, it will be dynamically set to the Data's text key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "data_template",
@ -602,7 +684,9 @@
"display_name": "Text",
"dynamic": false,
"info": "Message to be passed as output.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -623,7 +707,10 @@
"dynamic": false,
"info": "Type of sender.",
"name": "sender",
"options": ["Machine", "User"],
"options": [
"Machine",
"User"
],
"placeholder": "",
"required": false,
"show": true,
@ -638,7 +725,9 @@
"display_name": "Sender Name",
"dynamic": false,
"info": "Name of the sender.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "sender_name",
@ -657,7 +746,9 @@
"display_name": "Session ID",
"dynamic": false,
"info": "The session ID of the chat. If empty, the current session ID parameter will be used.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "session_id",
@ -711,7 +802,10 @@
"display_name": "OpenAI",
"id": "OpenAIModel-gRakF",
"node": {
"base_classes": ["LanguageModel", "Message"],
"base_classes": [
"LanguageModel",
"Message"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -743,9 +837,15 @@
"display_name": "Text",
"method": "text_response",
"name": "text_output",
"required_inputs": ["input_value", "stream", "system_message"],
"required_inputs": [
"input_value",
"stream",
"system_message"
],
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
},
{
@ -765,7 +865,9 @@
"temperature"
],
"selected": "LanguageModel",
"types": ["LanguageModel"],
"types": [
"LanguageModel"
],
"value": "__UNDEFINED__"
}
],
@ -778,7 +880,9 @@
"display_name": "OpenAI API Key",
"dynamic": false,
"info": "The OpenAI API Key to use for the OpenAI model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"load_from_db": true,
"name": "api_key",
"password": true,
@ -813,7 +917,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -929,7 +1035,9 @@
"display_name": "Output Parser",
"dynamic": false,
"info": "The parser to use to parse the output of the model",
"input_types": ["OutputParser"],
"input_types": [
"OutputParser"
],
"list": false,
"name": "output_parser",
"placeholder": "",
@ -994,7 +1102,9 @@
"display_name": "System Message",
"dynamic": false,
"info": "System message to pass to the model.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "system_message",
@ -1046,7 +1156,10 @@
"data": {
"id": "ToolCallingAgent-0QzrL",
"node": {
"base_classes": ["AgentExecutor", "Message"],
"base_classes": [
"AgentExecutor",
"Message"
],
"beta": true,
"conditional_paths": [],
"custom_fields": {},
@ -1076,7 +1189,9 @@
"method": "build_agent",
"name": "agent",
"selected": "AgentExecutor",
"types": ["AgentExecutor"],
"types": [
"AgentExecutor"
],
"value": "__UNDEFINED__"
},
{
@ -1085,7 +1200,9 @@
"method": "message_response",
"name": "response",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -1098,7 +1215,9 @@
"display_name": "Chat History",
"dynamic": false,
"info": "",
"input_types": ["Data"],
"input_types": [
"Data"
],
"list": true,
"name": "chat_history",
"placeholder": "",
@ -1150,7 +1269,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -1169,7 +1290,9 @@
"display_name": "Language Model",
"dynamic": false,
"info": "",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "llm",
"placeholder": "",
@ -1202,7 +1325,9 @@
"display_name": "System Prompt",
"dynamic": false,
"info": "System prompt for the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1222,7 +1347,10 @@
"display_name": "Tools",
"dynamic": false,
"info": "",
"input_types": ["Tool", "BaseTool"],
"input_types": [
"Tool",
"BaseTool"
],
"list": true,
"load_from_db": false,
"name": "tools",
@ -1240,7 +1368,9 @@
"display_name": "Prompt",
"dynamic": false,
"info": "This prompt must contain 'input' key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1293,7 +1423,11 @@
"data": {
"id": "SearchAPI-I4yU0",
"node": {
"base_classes": ["Data", "list", "Tool"],
"base_classes": [
"Data",
"list",
"Tool"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -1327,7 +1461,9 @@
"search_params"
],
"selected": "Data",
"types": ["Data"],
"types": [
"Data"
],
"value": "__UNDEFINED__"
},
{
@ -1344,7 +1480,9 @@
"search_params"
],
"selected": "Tool",
"types": ["Tool"],
"types": [
"Tool"
],
"value": "__UNDEFINED__"
}
],
@ -1357,7 +1495,9 @@
"display_name": "SearchAPI API Key",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"load_from_db": true,
"name": "api_key",
"password": true,
@ -1392,7 +1532,9 @@
"display_name": "Engine",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "engine",
@ -1411,7 +1553,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1496,7 +1640,11 @@
"data": {
"id": "url_content_fetcher-1FugB",
"node": {
"base_classes": ["Data", "list", "Tool"],
"base_classes": [
"Data",
"list",
"Tool"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -1504,7 +1652,10 @@
"display_name": "URL Content Fetcher",
"documentation": "https://python.langchain.com/docs/modules/data_connection/document_loaders/integrations/web_base",
"edited": true,
"field_order": ["url", "fetch_params"],
"field_order": [
"url",
"fetch_params"
],
"frozen": false,
"icon": "globe",
"lf_version": "1.0.15",
@ -1517,7 +1668,10 @@
"method": "run_model",
"name": "api_run_model",
"selected": "Data",
"types": ["Data", "list"],
"types": [
"Data",
"list"
],
"value": "__UNDEFINED__"
},
{
@ -1526,7 +1680,9 @@
"method": "build_tool",
"name": "api_build_tool",
"selected": "Tool",
"types": ["Tool"],
"types": [
"Tool"
],
"value": "__UNDEFINED__"
}
],
@ -1573,7 +1729,9 @@
"display_name": "URL",
"dynamic": false,
"info": "Enter a single URL to fetch content from.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "url",
@ -1609,7 +1767,10 @@
"data": {
"id": "ToolCallingAgent-KLe5u",
"node": {
"base_classes": ["AgentExecutor", "Message"],
"base_classes": [
"AgentExecutor",
"Message"
],
"beta": true,
"conditional_paths": [],
"custom_fields": {},
@ -1639,7 +1800,9 @@
"method": "build_agent",
"name": "agent",
"selected": "AgentExecutor",
"types": ["AgentExecutor"],
"types": [
"AgentExecutor"
],
"value": "__UNDEFINED__"
},
{
@ -1648,7 +1811,9 @@
"method": "message_response",
"name": "response",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -1661,7 +1826,9 @@
"display_name": "Chat History",
"dynamic": false,
"info": "",
"input_types": ["Data"],
"input_types": [
"Data"
],
"list": true,
"name": "chat_history",
"placeholder": "",
@ -1713,7 +1880,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -1732,7 +1901,9 @@
"display_name": "Language Model",
"dynamic": false,
"info": "",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "llm",
"placeholder": "",
@ -1765,7 +1936,9 @@
"display_name": "System Prompt",
"dynamic": false,
"info": "System prompt for the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1785,7 +1958,10 @@
"display_name": "Tools",
"dynamic": false,
"info": "",
"input_types": ["Tool", "BaseTool"],
"input_types": [
"Tool",
"BaseTool"
],
"list": true,
"load_from_db": false,
"name": "tools",
@ -1803,7 +1979,9 @@
"display_name": "Prompt",
"dynamic": false,
"info": "This prompt must contain 'input' key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -1856,7 +2034,10 @@
"data": {
"id": "ToolCallingAgent-VYDK9",
"node": {
"base_classes": ["AgentExecutor", "Message"],
"base_classes": [
"AgentExecutor",
"Message"
],
"beta": true,
"conditional_paths": [],
"custom_fields": {},
@ -1886,7 +2067,9 @@
"method": "build_agent",
"name": "agent",
"selected": "AgentExecutor",
"types": ["AgentExecutor"],
"types": [
"AgentExecutor"
],
"value": "__UNDEFINED__"
},
{
@ -1895,7 +2078,9 @@
"method": "message_response",
"name": "response",
"selected": "Message",
"types": ["Message"],
"types": [
"Message"
],
"value": "__UNDEFINED__"
}
],
@ -1908,7 +2093,9 @@
"display_name": "Chat History",
"dynamic": false,
"info": "",
"input_types": ["Data"],
"input_types": [
"Data"
],
"list": true,
"name": "chat_history",
"placeholder": "",
@ -1960,7 +2147,9 @@
"display_name": "Input",
"dynamic": false,
"info": "",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "input_value",
@ -1979,7 +2168,9 @@
"display_name": "Language Model",
"dynamic": false,
"info": "",
"input_types": ["LanguageModel"],
"input_types": [
"LanguageModel"
],
"list": false,
"name": "llm",
"placeholder": "",
@ -2012,7 +2203,9 @@
"display_name": "System Prompt",
"dynamic": false,
"info": "System prompt for the agent.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2032,7 +2225,10 @@
"display_name": "Tools",
"dynamic": false,
"info": "",
"input_types": ["Tool", "BaseTool"],
"input_types": [
"Tool",
"BaseTool"
],
"list": true,
"load_from_db": false,
"name": "tools",
@ -2050,7 +2246,9 @@
"display_name": "Prompt",
"dynamic": false,
"info": "This prompt must contain 'input' key.",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"multiline": true,
@ -2103,7 +2301,12 @@
"data": {
"id": "CalculatorTool-5S6u9",
"node": {
"base_classes": ["Data", "list", "Sequence", "Tool"],
"base_classes": [
"Data",
"list",
"Sequence",
"Tool"
],
"beta": false,
"conditional_paths": [],
"custom_fields": {},
@ -2111,7 +2314,9 @@
"display_name": "Calculator",
"documentation": "",
"edited": false,
"field_order": ["expression"],
"field_order": [
"expression"
],
"frozen": false,
"icon": "calculator",
"lf_version": "1.0.15",
@ -2124,9 +2329,13 @@
"display_name": "Data",
"method": "run_model",
"name": "api_run_model",
"required_inputs": ["expression"],
"required_inputs": [
"expression"
],
"selected": "Data",
"types": ["Data"],
"types": [
"Data"
],
"value": "__UNDEFINED__"
},
{
@ -2134,9 +2343,13 @@
"display_name": "Tool",
"method": "build_tool",
"name": "api_build_tool",
"required_inputs": ["expression"],
"required_inputs": [
"expression"
],
"selected": "Tool",
"types": ["Tool"],
"types": [
"Tool"
],
"value": "__UNDEFINED__"
}
],
@ -2167,7 +2380,9 @@
"display_name": "Expression",
"dynamic": false,
"info": "The arithmetic expression to evaluate (e.g., '4*4*(33/22)+12-20').",
"input_types": ["Message"],
"input_types": [
"Message"
],
"list": false,
"load_from_db": false,
"name": "expression",


@ -36,7 +36,7 @@ def get_messages(
List[Data]: A list of Data objects representing the retrieved messages.
"""
with session_scope() as session:
stmt = select(MessageTable)
stmt = select(MessageTable).where(MessageTable.error == False) # noqa: E712
if sender:
stmt = stmt.where(MessageTable.sender == sender)
if sender_name:
@ -142,7 +142,7 @@ class LCBuiltinChatMemory(BaseChatMessageHistory):
messages = get_messages(
session_id=self.session_id,
)
return [m.to_lc_message() for m in messages]
return [m.to_lc_message() for m in messages if not m.error] # Exclude error messages
def add_messages(self, messages: Sequence[BaseMessage]) -> None:
for lc_message in messages:
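The filter above can be sketched independently in plain Python (the dataclass below is an illustrative stand-in for langflow's MessageTable rows, not the real model):

```python
from dataclasses import dataclass
from typing import List


@dataclass
class StoredMessage:
    # Illustrative stand-in for a langflow MessageTable row.
    sender: str
    text: str
    error: bool = False


def history_without_errors(messages: List[StoredMessage]) -> List[StoredMessage]:
    # Mirrors the new behavior: messages flagged as errors are never
    # replayed into the chat history handed back to the model.
    return [m for m in messages if not m.error]


msgs = [
    StoredMessage("User", "hi"),
    StoredMessage("Machine", "Traceback (most recent call last): ...", error=True),
    StoredMessage("Machine", "hello!"),
]
clean = history_without_errors(msgs)
print([m.text for m in clean])  # ['hi', 'hello!']
```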


@ -54,6 +54,8 @@ class Message(Data):
default_factory=lambda: datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
)
flow_id: str | UUID | None = None
error: bool = Field(default=False)
edit: bool = Field(default=False)
@field_validator("flow_id", mode="before")
@classmethod
@ -64,10 +66,14 @@ class Message(Data):
@field_serializer("flow_id")
def serialize_flow_id(value):
if isinstance(value, str):
return UUID(value)
if isinstance(value, UUID):
return str(value)
return value
@field_serializer("timestamp")
def serialize_timestamp(value):
return datetime.strptime(value, "%Y-%m-%d %H:%M:%S").astimezone(timezone.utc)
@field_validator("files", mode="before")
@classmethod
def validate_files(cls, value):
@ -154,6 +160,8 @@ class Message(Data):
session_id=data.session_id,
timestamp=data.timestamp,
flow_id=data.flow_id,
error=data.error,
edit=data.edit,
)
@field_serializer("text", mode="plain")
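The timestamp round trip introduced above can be sketched with the stdlib alone (the helper names are assumptions for illustration, not langflow APIs):

```python
from datetime import datetime, timezone


def make_timestamp() -> str:
    # Mirrors the default_factory above: store the timestamp as a UTC string.
    return datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")


def parse_timestamp(value: str) -> datetime:
    # Mirrors serialize_timestamp. Note that strptime returns a naive
    # datetime, and astimezone() interprets a naive value in the process's
    # local zone before converting, so the round trip is only exact when
    # the process itself runs in UTC.
    return datetime.strptime(value, "%Y-%m-%d %H:%M:%S").astimezone(timezone.utc)


parsed = parse_timestamp(make_timestamp())
print(parsed.tzinfo)  # UTC
```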


@ -4,7 +4,7 @@ from langflow.services.database.models.message.model import MessageTable, Messag
from langflow.services.deps import session_scope
def update_message(message_id: UUID, message: MessageUpdate | dict):
def update_message(message_id: UUID | str, message: MessageUpdate | dict):
if not isinstance(message, MessageUpdate):
message = MessageUpdate(**message)
with session_scope() as session:
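The widened signature can be exercised with a short sketch (the dataclass below is a hypothetical stand-in for MessageUpdate, and the normalization shown is an assumption about intent, not the exact langflow code):

```python
import uuid
from dataclasses import dataclass
from typing import Optional, Union


@dataclass
class MessageUpdateSketch:
    # Hypothetical stand-in for langflow's MessageUpdate model.
    text: Optional[str] = None
    error: Optional[bool] = None


def normalize(message_id: Union[uuid.UUID, str],
              message: Union[MessageUpdateSketch, dict]):
    # Mirrors the update_message change: accept string ids and plain dicts,
    # normalizing both before touching the database session.
    if isinstance(message_id, str):
        message_id = uuid.UUID(message_id)  # raises ValueError on a bad id
    if not isinstance(message, MessageUpdateSketch):
        message = MessageUpdateSketch(**message)
    return message_id, message


mid, upd = normalize("12345678-1234-5678-1234-567812345678", {"text": "hi"})
print(type(mid).__name__, upd.text)
```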


@ -18,6 +18,8 @@ class MessageBase(SQLModel):
session_id: str
text: str = Field(sa_column=Column(Text))
files: list[str] = Field(default_factory=list)
error: bool = Field(default=False)
edit: bool = Field(default=False)
@field_validator("files", mode="before")
@classmethod
@ -100,3 +102,5 @@ class MessageUpdate(SQLModel):
sender_name: str | None = None
session_id: str | None = None
files: list[str] | None = None
edit: bool | None = None
error: bool | None = None
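With every MessageUpdate field optional, an update is effectively a partial patch. A minimal sketch of that semantics, using plain dicts as stand-ins for the SQLModel rows:

```python
def apply_partial_update(row: dict, update: dict) -> dict:
    # Only fields the caller actually set (non-None) overwrite the stored
    # row; None means "leave unchanged", matching the all-optional model.
    patched = dict(row)
    patched.update({k: v for k, v in update.items() if v is not None})
    return patched


row = {"text": "old", "error": False, "edit": False}
print(apply_partial_update(row, {"edit": True, "text": None}))
# {'text': 'old', 'error': False, 'edit': True}
```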


@ -3,7 +3,7 @@ import { addPlusSignes, cn, sortShortcuts } from "@/utils/utils";
import RenderKey from "./components/renderKey";
export default function RenderIcons({
filteredShortcut,
filteredShortcut = [],
tableRender = false,
}: {
filteredShortcut: string[];


@ -40,7 +40,6 @@ export const useGetMessagesQuery: useQueryFunctionType<
const data = await getMessagesFn(id, params);
const columns = extractColumnsFromRows(data.data, mode, excludedFields);
useMessagesStore.getState().setMessages(data.data);
useMessagesStore.getState().setColumns(columns);
return { rows: data, columns };
};


@ -1,3 +1,4 @@
import useFlowsManagerStore from "@/stores/flowsManagerStore";
import { useMutationFunctionType } from "@/types/api";
import { Message } from "@/types/messages";
import { UseMutationResult } from "@tanstack/react-query";
@ -6,28 +7,45 @@ import { getURL } from "../../helpers/constants";
import { UseRequestProcessor } from "../../services/request-processor";
interface UpdateMessageParams {
message: Message;
message: Partial<Message>;
refetch?: boolean;
}
export const useUpdateMessage: useMutationFunctionType<
undefined,
UpdateMessageParams
> = (options?) => {
const { mutate } = UseRequestProcessor();
const { mutate, queryClient } = UseRequestProcessor();
const updateMessageApi = async (data: Message) => {
if (data.files && typeof data.files === "string") {
data.files = JSON.parse(data.files);
const updateMessageApi = async (data: UpdateMessageParams) => {
const message = data.message;
if (message.files && typeof message.files === "string") {
message.files = JSON.parse(message.files);
}
const result = await api.put(`${getURL("MESSAGES")}/${data.id}`, data);
const result = await api.put(
`${getURL("MESSAGES")}/${message.id}`,
message,
);
return result.data;
};
const mutation: UseMutationResult<
UpdateMessageParams,
any,
UpdateMessageParams
> = mutate(["useUpdateMessages"], updateMessageApi, options);
const mutation: UseMutationResult<Message, any, UpdateMessageParams> = mutate(
["useUpdateMessages"],
updateMessageApi,
{
...options,
onSettled: (_, __, params, ___) => {
const flowId = useFlowsManagerStore.getState().currentFlowId;
//@ts-ignore
if (params?.refetch && flowId) {
queryClient.refetchQueries({
queryKey: ["useGetMessagesQuery", { id: flowId }],
exact: true,
});
}
},
},
);
return mutation;
};


@ -0,0 +1,42 @@
import { useMutationFunctionType } from "@/types/api";
import { Message } from "@/types/messages";
import { UseMutationResult } from "@tanstack/react-query";
import { api } from "../../api";
import { getURL } from "../../helpers/constants";
import { UseRequestProcessor } from "../../services/request-processor";
interface UpdateSessionParams {
old_session_id: string;
new_session_id: string;
}
export const useUpdateSessionName: useMutationFunctionType<
undefined,
UpdateSessionParams
> = (options?) => {
const { mutate, queryClient } = UseRequestProcessor();
const updateSessionApi = async (data: UpdateSessionParams) => {
const result = await api.patch(
`${getURL("MESSAGES")}/session/${data.old_session_id}`,
null,
{
params: { new_session_id: data.new_session_id },
},
);
return result.data;
};
const mutation: UseMutationResult<Message[], any, UpdateSessionParams> =
mutate(["useUpdateSessionName"], updateSessionApi, {
...options,
onSettled: (data, variables, context) => {
// Invalidate and refetch relevant queries
queryClient.refetchQueries({
queryKey: ["useGetMessagesQuery"],
});
},
});
return mutation;
};

@@ -0,0 +1,214 @@
import IconComponent from "@/components/genericIconComponent";
import ShadTooltip from "@/components/shadTooltipComponent";
import { Badge } from "@/components/ui/badge";
import { Input } from "@/components/ui/input";
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
} from "@/components/ui/select-custom";
import { useUpdateSessionName } from "@/controllers/API/queries/messages/use-rename-session";
import useFlowStore from "@/stores/flowStore";
import { cn } from "@/utils/utils";
import React, { useEffect, useRef, useState } from "react";
export default function SessionSelector({
deleteSession,
session,
toggleVisibility,
isVisible,
inspectSession,
updateVisibleSession,
selectedView,
setSelectedView,
}: {
deleteSession: (session: string) => void;
session: string;
toggleVisibility: () => void;
isVisible: boolean;
inspectSession: (session: string) => void;
updateVisibleSession: (session: string) => void;
selectedView?: { type: string; id: string };
setSelectedView: (view: { type: string; id: string } | undefined) => void;
}) {
const currentFlowId = useFlowStore((state) => state.currentFlow?.id);
const [isEditing, setIsEditing] = useState(false);
const [editedSession, setEditedSession] = useState(session);
const { mutate: updateSessionName } = useUpdateSessionName();
const inputRef = useRef<HTMLInputElement>(null);
useEffect(() => {
setEditedSession(session);
}, [session]);
const handleEditClick = (e?: React.MouseEvent<HTMLDivElement>) => {
e?.stopPropagation();
setIsEditing(true);
};
const handleInputChange = (e: React.ChangeEvent<HTMLInputElement>) => {
setEditedSession(e.target.value);
};
const handleConfirm = () => {
setIsEditing(false);
if (editedSession.trim() !== session) {
updateSessionName(
{ old_session_id: session, new_session_id: editedSession.trim() },
{
onSuccess: () => {
if (isVisible) {
updateVisibleSession(editedSession);
}
if (
selectedView?.type === "Session" &&
selectedView?.id === session
) {
setSelectedView({ type: "Session", id: editedSession });
}
},
},
);
}
};
const handleCancel = () => {
setIsEditing(false);
setEditedSession(session);
};
const handleSelectChange = (value: string) => {
switch (value) {
case "rename":
handleEditClick();
break;
case "messageLogs":
inspectSession(session);
break;
case "delete":
deleteSession(session);
break;
}
};
return (
<div
data-testid="session-selector"
onClick={(e) => {
if (isEditing) e.stopPropagation();
else toggleVisibility();
}}
className={cn(
"file-component-accordion-div group cursor-pointer rounded-md hover:bg-muted-foreground/30",
isVisible ? "bg-muted-foreground/15" : "",
)}
>
<div className="flex w-full items-center justify-between gap-2 overflow-hidden border-b px-2 py-3 align-middle">
<div className="flex min-w-0 items-center gap-2">
{isEditing ? (
<div className="flex items-center">
<Input
ref={inputRef}
value={editedSession}
onKeyDown={(e) => {
if (e.key === "Enter") {
e.preventDefault();
e.stopPropagation();
handleConfirm();
}
}}
onChange={handleInputChange}
onBlur={(e) => {
if (
!e.relatedTarget ||
e.relatedTarget.getAttribute("data-confirm") !== "true"
) {
handleCancel();
}
}}
autoFocus
className="h-6 flex-grow px-1 py-0"
/>
<button
onClick={handleCancel}
className="hover:text-status-red-hover ml-2 text-status-red"
>
<IconComponent name="X" className="h-4 w-4" />
</button>
<button
onClick={handleConfirm}
data-confirm="true"
className="ml-2 text-green-500 hover:text-green-600"
>
<IconComponent name="Check" className="h-4 w-4" />
</button>
</div>
) : (
<ShadTooltip styleClasses="z-50" content={session}>
<div>
<Badge
variant="gray"
size="md"
className="block cursor-pointer truncate"
>
{session === currentFlowId ? "Default Session" : session}
</Badge>
</div>
</ShadTooltip>
)}
</div>
<Select value={""} onValueChange={handleSelectChange}>
<SelectTrigger
onClick={(e) => {
e.stopPropagation();
}}
onFocusCapture={() => {
inputRef.current?.focus();
}}
data-confirm="true"
className="h-8 w-8 border-none bg-transparent p-0 focus:ring-0"
>
<IconComponent name="MoreHorizontal" className="h-4 w-4" />
</SelectTrigger>
<SelectContent side="right" align="start" className="w-40 p-0">
<SelectItem
value="rename"
className="cursor-pointer px-3 py-2 focus:bg-muted"
>
<div className="flex items-center">
<IconComponent name="Pencil" className="mr-2 h-4 w-4" />
Rename
</div>
</SelectItem>
<SelectItem
value="messageLogs"
className="cursor-pointer px-3 py-2 focus:bg-muted"
>
<div className="flex w-full items-center justify-between">
<div className="flex items-center">
<IconComponent name="ScrollText" className="mr-2 h-4 w-4" />
Message logs
</div>
<IconComponent
name="ExternalLink"
className="absolute right-2 h-4 w-4"
/>
</div>
</SelectItem>
<SelectItem
value="delete"
className="cursor-pointer px-3 py-2 focus:bg-muted"
>
<div className="flex items-center text-status-red hover:text-status-red">
<IconComponent name="Trash2" className="mr-2 h-4 w-4" />
Delete
</div>
</SelectItem>
</SelectContent>
</Select>
</div>
</div>
);
}

@@ -10,7 +10,10 @@ import { useMemo, useState } from "react";
import TableComponent from "../../../../components/tableComponent";
import useAlertStore from "../../../../stores/alertStore";
import { useMessagesStore } from "../../../../stores/messagesStore";
import { messagesSorter } from "../../../../utils/utils";
import {
extractColumnsFromRows,
messagesSorter,
} from "../../../../utils/utils";
export default function SessionView({
session,
@@ -19,12 +22,12 @@ export default function SessionView({
session?: string;
id?: string;
}) {
const columns = useMessagesStore((state) => state.columns);
const messages = useMessagesStore((state) => state.messages);
const setErrorData = useAlertStore((state) => state.setErrorData);
const setSuccessData = useAlertStore((state) => state.setSuccessData);
const updateMessage = useMessagesStore((state) => state.updateMessage);
const deleteMessagesStore = useMessagesStore((state) => state.removeMessages);
const columns = extractColumnsFromRows(messages, "intersection");
const isFetching = useIsFetching({
queryKey: ["useGetMessagesQuery"],
exact: false,
@@ -56,22 +59,25 @@ export default function SessionView({
...row,
[field]: newValue,
};
updateMessageMutation(data, {
onSuccess: () => {
updateMessage(data);
// Set success message
setSuccessData({
title: "Messages updated successfully.",
});
updateMessageMutation(
{ message: data },
{
onSuccess: () => {
updateMessage(data);
// Set success message
setSuccessData({
title: "Messages updated successfully.",
});
},
onError: () => {
setErrorData({
title: "Error updating messages.",
});
event.data[field] = event.oldValue;
event.api.refreshCells();
},
},
onError: () => {
setErrorData({
title: "Error updating messages.",
});
event.data[field] = event.oldValue;
event.api.refreshCells();
},
});
);
}
const filteredMessages = useMemo(() => {

@@ -18,7 +18,7 @@ const UploadFileButton = ({
/>
<Button
disabled={lockChat}
className={`font-bold text-white transition-all ${
className={`font-bold transition-all dark:text-white ${
lockChat ? "cursor-not-allowed" : "hover:text-muted-foreground"
}`}
onClick={handleButtonClick}

@@ -0,0 +1,13 @@
import IconComponent from "@/components/genericIconComponent";
import { Button } from "@/components/ui/button";
import { ButtonHTMLAttributes } from "react";
export function EditMessageButton(
props: ButtonHTMLAttributes<HTMLButtonElement>,
) {
return (
<Button variant="ghost" size="icon" {...props}>
<IconComponent name="pencil" className="h-4 w-4" />
</Button>
);
}

@@ -0,0 +1,75 @@
import { Button } from "@/components/ui/button";
import { Textarea } from "@/components/ui/textarea";
import { useEffect, useRef, useState } from "react";
export default function EditMessageField({
message: initialMessage,
onEdit,
onCancel,
}: {
message: string;
onEdit: (message: string) => void;
onCancel: () => void;
}) {
const [message, setMessage] = useState(initialMessage);
const textareaRef = useRef<HTMLTextAreaElement>(null);
const [isButtonClicked, setIsButtonClicked] = useState(false);
const adjustTextareaHeight = () => {
if (textareaRef.current) {
textareaRef.current.style.height = "auto";
textareaRef.current.style.height = `${textareaRef.current.scrollHeight + 3}px`;
}
};
useEffect(() => {
adjustTextareaHeight();
}, []);
return (
<div className="flex h-fit w-full flex-col">
<Textarea
ref={textareaRef}
className="h-mx-full"
onBlur={() => {
if (!isButtonClicked) {
onCancel();
}
}}
value={message}
autoFocus={true}
onChange={(e) => setMessage(e.target.value)}
/>
<div className="flex w-full flex-row-reverse justify-between">
<div className="flex flex-row-reverse gap-2">
<Button
data-testid="save-button"
onMouseDown={() => setIsButtonClicked(true)}
onClick={() => {
onEdit(message);
setIsButtonClicked(false);
}}
className="btn btn-primary mt-2"
>
Save
</Button>
<Button
data-testid="cancel-button"
onMouseDown={() => setIsButtonClicked(true)}
onClick={() => {
onCancel();
setIsButtonClicked(false);
}}
className="btn btn-secondary mt-2"
>
Cancel
</Button>
</div>
<div>
<span className="mr-4 text-sm text-muted-foreground">
Editing messages will update the memory but won't restart the
conversation.
</span>
</div>
</div>
</div>
);
}

@@ -5,16 +5,25 @@ import formatFileName from "../../../filePreviewChat/utils/format-file-name";
export default function FileCardWrapper({
index,
name,
type,
path,
}: {
index: number;
name: string;
type: string;
path: string;
path: { path: string; type: string; name: string } | string;
}) {
const [show, setShow] = useState<boolean>(true);
let name: string = "";
let type: string = "";
let pathString: string = "";
if (typeof path === "string") {
name = path.split("/").pop() || "";
type = path.split(".").pop() || "";
pathString = path;
} else {
name = path.name;
type = path.type;
pathString = path.path;
}
return (
<div key={index} className="flex flex-col gap-2">
<span
@@ -24,7 +33,12 @@ export default function FileCardWrapper({
{formatFileName(name, 50)}
<ForwardedIconComponent name={show ? "ChevronDown" : "ChevronRight"} />
</span>
<FileCard showFile={show} fileName={name} fileType={type} path={path} />
<FileCard
showFile={show}
fileName={name}
fileType={type}
path={pathString}
/>
</div>
);
}
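The branching above is the heart of the new `FileCardWrapper`: a file reference is now either a bare path string (from which name and extension are derived) or an object that already carries them. As a self-contained sketch, with `resolveFileMeta` as a hypothetical name:

```typescript
// Hypothetical extraction of FileCardWrapper's path handling.
type FileRef = { path: string; type: string; name: string } | string;

function resolveFileMeta(ref: FileRef): { name: string; type: string; path: string } {
  if (typeof ref === "string") {
    return {
      name: ref.split("/").pop() || "", // last path segment
      type: ref.split(".").pop() || "", // extension
      path: ref,
    };
  }
  return { name: ref.name, type: ref.type, path: ref.path };
}
```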

@@ -1,10 +1,12 @@
import ShadTooltip from "@/components/shadTooltipComponent";
import { useUpdateMessage } from "@/controllers/API/queries/messages";
import useFlowsManagerStore from "@/stores/flowsManagerStore";
import { useUtilityStore } from "@/stores/utilityStore";
import Convert from "ansi-to-html";
import { useEffect, useMemo, useRef, useState } from "react";
import { useEffect, useRef, useState } from "react";
import Markdown from "react-markdown";
import rehypeMathjax from "rehype-mathjax";
import remarkGfm from "remark-gfm";
import remarkMath from "remark-math";
import MaleTechnology from "../../../../../assets/male-technologist.png";
import Robot from "../../../../../assets/robot.png";
import CodeTabsComponent from "../../../../../components/codeTabsComponent";
@@ -15,9 +17,10 @@ import {
EMPTY_OUTPUT_SEND_MESSAGE,
} from "../../../../../constants/constants";
import useAlertStore from "../../../../../stores/alertStore";
import useFlowStore from "../../../../../stores/flowStore";
import { chatMessagePropsType } from "../../../../../types/components";
import { classNames, cn } from "../../../../../utils/utils";
import { cn } from "../../../../../utils/utils";
import { EditMessageButton } from "./components/editMessageButton";
import EditMessageField from "./components/editMessageField";
import FileCardWrapper from "./components/fileCardWrapper";
export default function ChatMessage({
@@ -27,21 +30,27 @@
updateChat,
setLockChat,
}: chatMessagePropsType): JSX.Element {
const [showFile, setShowFile] = useState<boolean>(true);
const convert = new Convert({ newline: true });
const [hidden, setHidden] = useState(true);
const template = chat.template;
const [promptOpen, setPromptOpen] = useState(false);
const [streamUrl, setStreamUrl] = useState(chat.stream_url);
const flow_id = useFlowsManagerStore((state) => state.currentFlowId);
// We need to check if message is not undefined because
// we need to run .toString() on it
const chatMessageString = chat.message ? chat.message.toString() : "";
const [chatMessage, setChatMessage] = useState(chatMessageString);
const [chatMessage, setChatMessage] = useState(
chat.message ? chat.message.toString() : "",
);
const [isStreaming, setIsStreaming] = useState(false);
const eventSource = useRef<EventSource | undefined>(undefined);
const setErrorData = useAlertStore((state) => state.setErrorData);
const chatMessageRef = useRef(chatMessage);
const [editMessage, setEditMessage] = useState(false);
useEffect(() => {
const chatMessageString = chat.message ? chat.message.toString() : "";
setChatMessage(chatMessageString);
}, [chat]);
const playgroundScrollBehaves = useUtilityStore(
(state) => state.playgroundScrollBehaves,
);
@@ -133,10 +142,68 @@ export default function ChatMessage({
console.error(e);
}
const isEmpty = decodedMessage?.trim() === "";
const { mutate: updateMessageMutation } = useUpdateMessage();
const convertFiles = (
files:
| (
| string
| {
path: string;
type: string;
name: string;
}
)[]
| undefined,
) => {
if (!files) return [];
return files.map((file) => {
if (typeof file === "string") {
return file;
}
return file.path;
});
};
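`convertFiles` above flattens the mixed file representation down to plain path strings before the update request is built. Extracted as-is, it behaves like:

```typescript
// Mirrors the convertFiles helper in the hunk above: objects are reduced to
// their `path`, plain strings pass through, and a missing list becomes [].
function convertFiles(
  files: (string | { path: string; type: string; name: string })[] | undefined,
): string[] {
  if (!files) return [];
  return files.map((file) => (typeof file === "string" ? file : file.path));
}
```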
const handleEditMessage = (message: string) => {
updateMessageMutation(
{
message: {
...chat,
files: convertFiles(chat.files),
sender_name: chat.sender_name ?? "AI",
text: message,
sender: chat.isSend ? "User" : "Machine",
flow_id,
session_id: chat.session ?? "",
},
refetch: true,
},
{
onSuccess: () => {
updateChat(chat, message);
setEditMessage(false);
},
onError: () => {
setErrorData({
title: "Error updating messages.",
});
},
},
);
};
const editedFlag = chat.edit ? (
<span className="text-sm text-chat-trigger-disabled">(Edited)</span>
) : null;
return (
<>
<div className={cn("form-modal-chat-position", chat.isSend ? "" : " ")}>
<div
className={cn(
"form-modal-chat-position group hover:bg-background",
chat.isSend ? "" : " ",
)}
>
<div
className={
"mr-3 mt-1 flex w-24 flex-col items-center gap-1 overflow-hidden px-3 pb-3"
@@ -198,87 +265,107 @@
}
className="flex w-full flex-col"
>
{useMemo(
() =>
isEmpty && lockChat ? (
<IconComponent
name="MoreHorizontal"
className="h-8 w-8 animate-pulse"
{chatMessage === "" && lockChat ? (
<IconComponent
name="MoreHorizontal"
className="h-8 w-8 animate-pulse"
/>
) : (
<div onDoubleClick={() => setEditMessage(true)}>
{editMessage ? (
<EditMessageField
key={`edit-message-${chat.id}`}
message={decodedMessage}
onEdit={(message) => {
handleEditMessage(message);
}}
onCancel={() => setEditMessage(false)}
/>
) : (
<Markdown
remarkPlugins={[remarkGfm]}
linkTarget="_blank"
rehypePlugins={[rehypeMathjax]}
className={cn(
"markdown prose flex flex-col word-break-break-word dark:prose-invert",
isEmpty
? "text-chat-trigger-disabled"
: "text-primary",
)}
components={{
pre({ node, ...props }) {
return <>{props.children}</>;
},
code: ({
node,
inline,
className,
children,
...props
}) => {
if (typeof children === "string") {
if ((children as string)!.length) {
if (children![0] === "▍") {
return (
<span className="form-modal-markdown-span">
</span>
<>
<div className="flex gap-2">
<Markdown
remarkPlugins={[remarkGfm]}
linkTarget="_blank"
rehypePlugins={[rehypeMathjax]}
className={cn(
"markdown prose flex flex-col word-break-break-word dark:prose-invert",
isEmpty
? "text-chat-trigger-disabled"
: "text-primary",
)}
components={{
pre({ node, ...props }) {
return <>{props.children}</>;
},
code: ({
node,
inline,
className,
children,
...props
}) => {
let content = children as string;
if (
Array.isArray(children) &&
children.length === 1 &&
typeof children[0] === "string"
) {
content = children[0] as string;
}
if (typeof content === "string") {
if (content.length) {
if (content[0] === "▍") {
return (
<span className="form-modal-markdown-span">
</span>
);
}
}
const match = /language-(\w+)/.exec(
className || "",
);
return !inline ? (
<CodeTabsComponent
isMessage
tabs={[
{
name: (match && match[1]) || "",
mode: (match && match[1]) || "",
image:
"https://curl.se/logo/curl-symbol-transparent.png",
language:
(match && match[1]) || "",
code: String(content).replace(
/\n$/,
"",
),
},
]}
activeTab={"0"}
setActiveTab={() => {}}
/>
) : (
<code className={className} {...props}>
{content}
</code>
);
}
children![0] = (
children![0] as string
).replace("`▍`", "▍");
}
}
const match = /language-(\w+)/.exec(
className || "",
);
return !inline ? (
<CodeTabsComponent
isMessage
tabs={[
{
name: (match && match[1]) || "",
mode: (match && match[1]) || "",
image:
"https://curl.se/logo/curl-symbol-transparent.png",
language: (match && match[1]) || "",
code: String(children).replace(
/\n$/,
"",
),
},
]}
activeTab={"0"}
setActiveTab={() => {}}
/>
) : (
<code className={className} {...props}>
{children}
</code>
);
},
}}
>
{isEmpty && !chat.stream_url
? EMPTY_OUTPUT_SEND_MESSAGE
: chatMessage}
</Markdown>
),
[chat.message, chatMessage],
},
}}
>
{isEmpty && !chat.stream_url
? EMPTY_OUTPUT_SEND_MESSAGE
: chatMessage}
</Markdown>
</div>
{editedFlag}
</>
)}
</div>
)}
</div>
</div>
@@ -343,25 +430,35 @@ export default function ChatMessage({
</>
) : (
<div className="flex flex-col">
<div
className={`whitespace-pre-wrap break-words ${
isEmpty ? "text-chat-trigger-disabled" : "text-primary"
}`}
data-testid={`chat-message-${chat.sender_name}-${chatMessage}`}
>
{isEmpty ? EMPTY_INPUT_SEND_MESSAGE : decodedMessage}
</div>
{editMessage ? (
<EditMessageField
key={`edit-message-${chat.id}`}
message={decodedMessage}
onEdit={(message) => {
handleEditMessage(message);
}}
onCancel={() => setEditMessage(false)}
/>
) : (
<>
<div
onDoubleClick={() => {
setEditMessage(true);
}}
className={`flex gap-2 whitespace-pre-wrap break-words ${
isEmpty ? "text-chat-trigger-disabled" : "text-primary"
}`}
data-testid={`chat-message-${chat.sender_name}-${chatMessage}`}
>
{isEmpty ? EMPTY_INPUT_SEND_MESSAGE : decodedMessage}
</div>
{editedFlag}
</>
)}
{chat.files && (
<div className="my-2 flex flex-col gap-5">
{chat.files.map((file, index) => {
return (
<FileCardWrapper
index={index}
name={file.name}
type={file.type}
path={file.path}
/>
);
return <FileCardWrapper index={index} path={file} />;
})}
</div>
)}
@@ -369,6 +466,16 @@
)}
</div>
)}
{!editMessage && (
<ShadTooltip content="Edit Message" styleClasses="z-50">
<div>
<EditMessageButton
className="invisible h-fit group-hover:visible"
onClick={() => setEditMessage(true)}
/>
</div>
</ShadTooltip>
)}
</div>
<div id={lastMessage ? "last-chat-message" : undefined} />
</>

@@ -2,11 +2,11 @@ import { INVALID_FILE_SIZE_ALERT } from "@/constants/alerts_constants";
import { useDeleteBuilds } from "@/controllers/API/queries/_builds";
import { usePostUploadFile } from "@/controllers/API/queries/files/use-post-upload-file";
import { track } from "@/customization/utils/analytics";
import { useMessagesStore } from "@/stores/messagesStore";
import { useUtilityStore } from "@/stores/utilityStore";
import { useEffect, useRef, useState } from "react";
import ShortUniqueId from "short-unique-id";
import IconComponent from "../../../../components/genericIconComponent";
import { Button } from "../../../../components/ui/button";
import {
ALLOWED_IMAGE_INPUT_EXTENSIONS,
CHAT_FIRST_INITIAL_TEXT,
@@ -17,10 +17,8 @@
import useAlertStore from "../../../../stores/alertStore";
import useFlowStore from "../../../../stores/flowStore";
import useFlowsManagerStore from "../../../../stores/flowsManagerStore";
import { VertexBuildTypeAPI } from "../../../../types/api";
import { ChatMessageType } from "../../../../types/chat";
import { FilePreviewType, chatViewProps } from "../../../../types/components";
import { classNames } from "../../../../utils/utils";
import ChatInput from "./chatInput";
import useDragAndDrop from "./chatInput/hooks/use-drag-and-drop";
import ChatMessage from "./chatMessage";
@@ -31,16 +29,17 @@ export default function ChatView({
setChatValue,
lockChat,
setLockChat,
visibleSession,
focusChat,
}: chatViewProps): JSX.Element {
const { flowPool, outputs, inputs, CleanFlowPool } = useFlowStore();
const { setErrorData } = useAlertStore();
const currentFlowId = useFlowsManagerStore((state) => state.currentFlowId);
const messagesRef = useRef<HTMLDivElement | null>(null);
const [chatHistory, setChatHistory] = useState<ChatMessageType[]>([]);
const messages = useMessagesStore((state) => state.messages);
const inputTypes = inputs.map((obj) => obj.type);
const inputIds = inputs.map((obj) => obj.id);
const outputIds = outputs.map((obj) => obj.id);
const updateFlowPool = useFlowStore((state) => state.updateFlowPool);
const [id, setId] = useState<string>("");
const { mutate: mutateDeleteFlowPool } = useDeleteBuilds();
@@ -48,62 +47,35 @@ export default function ChatView({
//build chat history
useEffect(() => {
const chatOutputResponses: VertexBuildTypeAPI[] = [];
outputIds.forEach((outputId) => {
if (outputId.includes("ChatOutput")) {
if (flowPool[outputId] && flowPool[outputId].length > 0) {
chatOutputResponses.push(...flowPool[outputId]);
}
}
});
inputIds.forEach((inputId) => {
if (inputId.includes("ChatInput")) {
if (flowPool[inputId] && flowPool[inputId].length > 0) {
chatOutputResponses.push(...flowPool[inputId]);
}
}
});
const chatMessages: ChatMessageType[] = chatOutputResponses
.sort((a, b) => Date.parse(a.timestamp) - Date.parse(b.timestamp))
const messagesFromMessagesStore: ChatMessageType[] = messages
.filter(
(output) =>
output.data.message || (!output.data.message && output.artifacts),
(message) =>
message.flow_id === currentFlowId &&
(visibleSession ? message.session_id === visibleSession : true),
)
.map((output, index) => {
try {
const messageOutput = output.data.message;
const hasMessageValue =
messageOutput?.message ||
messageOutput?.message === "" ||
(messageOutput?.files ?? []).length > 0 ||
messageOutput?.stream_url;
const { sender, message, sender_name, stream_url, files } =
hasMessageValue ? output.data.message : output.artifacts;
const is_ai =
sender === "Machine" || sender === null || sender === undefined;
return {
isSend: !is_ai,
message,
sender_name,
componentId: output.id,
stream_url: stream_url,
files,
};
} catch (e) {
console.error(e);
return {
isSend: false,
message: "Error parsing message",
sender_name: "Error",
componentId: output.id,
};
.map((message) => {
let files = message.files;
// handle the case where files is serialized as a JSON string (e.g. "[]")
if (typeof files === "string") {
files = JSON.parse(files);
}
return {
isSend: message.sender === "User",
message: message.text,
sender_name: message.sender_name,
files: files,
id: message.id,
timestamp: message.timestamp,
session: message.session_id,
edit: message.edit,
};
});
setChatHistory(chatMessages);
}, [flowPool]);
const finalChatHistory = [...messagesFromMessagesStore].sort((a, b) => {
return new Date(a.timestamp).getTime() - new Date(b.timestamp).getTime();
});
setChatHistory(finalChatHistory);
}, [flowPool, messages, visibleSession]);
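The effect above replaces the old flow-pool parsing with a direct read from the messages store: filter to the current flow and visible session, map each row to a chat entry (parsing `files` when it arrives as a JSON string), then sort by timestamp. A simplified sketch of that pipeline, with the session filter written out explicitly and types trimmed from the real stores:

```typescript
// Simplified sketch of the chat-history pipeline, assuming a trimmed message shape.
type StoredMessage = {
  id: string;
  flow_id: string;
  session_id: string;
  sender: string;
  sender_name: string;
  text: string;
  timestamp: string;
  files: string[] | string;
  edit: boolean;
};

function buildChatHistory(
  messages: StoredMessage[],
  flowId: string,
  visibleSession?: string,
) {
  return messages
    .filter(
      (m) =>
        m.flow_id === flowId &&
        (visibleSession ? m.session_id === visibleSession : true),
    )
    .map((m) => ({
      isSend: m.sender === "User",
      message: m.text,
      sender_name: m.sender_name,
      // handle files serialized as a JSON string (e.g. "[]")
      files: typeof m.files === "string" ? JSON.parse(m.files) : m.files,
      id: m.id,
      timestamp: m.timestamp,
      session: m.session_id,
      edit: m.edit,
    }))
    .sort(
      (a, b) => new Date(a.timestamp).getTime() - new Date(b.timestamp).getTime(),
    );
}
```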
useEffect(() => {
if (messagesRef.current) {
messagesRef.current.scrollTop = messagesRef.current.scrollHeight;
@@ -116,7 +88,8 @@ export default function ChatView({
if (ref.current) {
ref.current.focus();
}
}, []);
// trigger focus on chat when new session is set
}, [focusChat]);
function clearChat(): void {
setChatHistory([]);
@@ -150,11 +123,12 @@ export default function ChatView({
stream_url?: string,
) {
chat.message = message;
updateFlowPool(chat.componentId, {
message,
sender_name: chat.sender_name ?? "Bot",
sender: chat.isSend ? "User" : "Machine",
});
if (chat.componentId)
updateFlowPool(chat.componentId, {
message,
sender_name: chat.sender_name ?? "Bot",
sender: chat.isSend ? "User" : "Machine",
});
}
const [files, setFiles] = useState<FilePreviewType[]>([]);
const [isDragging, setIsDragging] = useState(false);
@@ -245,20 +219,6 @@
onDrop={onDrop}
>
<div className="eraser-size">
<div className="eraser-position">
<Button
className="flex gap-1"
unstyled
disabled={lockChat}
onClick={() => handleSelectChange("builds")}
>
<IconComponent
name="Eraser"
className={classNames("h-5 w-5 text-primary")}
aria-hidden="true"
/>
</Button>
</div>
<div ref={messagesRef} className="chat-message-div">
{chatHistory?.length > 0 ? (
chatHistory.map((chat, index) => (
@@ -267,7 +227,7 @@
lockChat={lockChat}
chat={chat}
lastMessage={chatHistory.length - 1 === index ? true : false}
key={`${chat.componentId}-${index}`}
key={`${chat.id}-${index}`}
updateChat={updateChat}
/>
))

@@ -3,7 +3,9 @@ import {
useGetMessagesQuery,
} from "@/controllers/API/queries/messages";
import { useUtilityStore } from "@/stores/utilityStore";
import { someFlowTemplateFields } from "@/utils/reactflowUtils";
import { useEffect, useState } from "react";
import ShortUniqueId from "short-unique-id";
import AccordionComponent from "../../components/accordionComponent";
import IconComponent from "../../components/genericIconComponent";
import ShadTooltip from "../../components/shadTooltipComponent";
@@ -26,6 +28,7 @@ import { NodeType } from "../../types/flow";
import { cn } from "../../utils/utils";
import BaseModal from "../baseModal";
import IOFieldView from "./components/IOFieldView";
import SessionSelector from "./components/IOFieldView/components/sessionSelector";
import SessionView from "./components/SessionView";
import ChatView from "./components/chatView";
@@ -59,10 +62,15 @@ export default function IOModal({
inputs.length > 0 ? 1 : outputs.length > 0 ? 2 : 0,
);
const setErrorData = useAlertStore((state) => state.setErrorData);
const setNoticeData = useAlertStore((state) => state.setNoticeData);
const setSuccessData = useAlertStore((state) => state.setSuccessData);
const deleteSession = useMessagesStore((state) => state.deleteSession);
const currentFlowId = useFlowsManagerStore((state) => state.currentFlowId);
const { mutate: deleteSessionFunction } = useDeleteMessages();
const [visibleSession, setvisibleSession] = useState<string | undefined>(
currentFlowId,
);
function handleDeleteSession(session_id: string) {
deleteSessionFunction(
@@ -77,6 +85,9 @@
title: "Session deleted successfully.",
});
deleteSession(session_id);
if (visibleSession === session_id) {
setvisibleSession(undefined);
}
},
onError: () => {
setErrorData({
@@ -109,13 +120,20 @@
const setLockChat = useFlowStore((state) => state.setLockChat);
const [chatValue, setChatValue] = useState("");
const isBuilding = useFlowStore((state) => state.isBuilding);
const currentFlowId = useFlowsManagerStore((state) => state.currentFlowId);
const setNode = useFlowStore((state) => state.setNode);
const [sessions, setSessions] = useState<string[]>([]);
const messages = useMessagesStore((state) => state.messages);
const [sessions, setSessions] = useState<string[]>(
Array.from(
new Set(
messages
.filter((message) => message.flow_id === currentFlowId)
.map((message) => message.session_id),
),
),
);
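The initial `sessions` state above derives the distinct session ids for the current flow from the messages store. As a pure helper (the name is illustrative):

```typescript
// Distinct session ids for one flow, in first-seen order (Set preserves insertion order).
function sessionIdsForFlow(
  messages: { flow_id: string; session_id: string }[],
  flowId: string,
): string[] {
  return Array.from(
    new Set(
      messages.filter((m) => m.flow_id === flowId).map((m) => m.session_id),
    ),
  );
}
```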
const flowPool = useFlowStore((state) => state.flowPool);
const { refetch } = useGetMessagesQuery(
const [sessionId, setSessionId] = useState<string>(currentFlowId);
useGetMessagesQuery(
{
mode: "union",
id: currentFlowId,
@@ -140,13 +158,14 @@
startNodeId: chatInput?.id,
files: files,
silent: true,
session: sessionId,
setLockChat,
}).catch((err) => {
console.error(err);
setLockChat(false);
});
}
refetch();
// refetch();
setLockChat(false);
if (chatInput) {
setNode(chatInput.id, (node: NodeType) => {
@@ -169,10 +188,27 @@
.forEach((row) => {
sessions.add(row.session_id);
});
setSessions(Array.from(sessions));
setSessions((prev) => {
if (prev.length < Array.from(sessions).length) {
// set the new session as visible
setvisibleSession(
Array.from(sessions)[Array.from(sessions).length - 1],
);
}
return Array.from(sessions);
});
}, [messages]);
useEffect(() => {
if (!visibleSession) {
setSessionId(
`Session ${new Date().toLocaleString("en-US", { day: "2-digit", month: "short", hour: "2-digit", minute: "2-digit", hour12: true, second: "2-digit" })}`,
);
} else if (visibleSession) {
setSessionId(visibleSession);
}
}, [visibleSession]);
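When no session is visible, the effect above falls back to a fresh human-readable session id derived from the current time. Extracted as a helper (the `newSessionId` name is an assumption, not from the PR):

```typescript
// Builds ids like "Session 15 Jan, 01:05:09 PM" from the wall clock.
function newSessionId(now: Date = new Date()): string {
  return `Session ${now.toLocaleString("en-US", {
    day: "2-digit",
    month: "short",
    hour: "2-digit",
    minute: "2-digit",
    second: "2-digit",
    hour12: true,
  })}`;
}
```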
const setPlaygroundScrollBehaves = useUtilityStore(
(state) => state.setPlaygroundScrollBehaves,
);
@@ -229,9 +265,7 @@
{outputs.length > 0 && (
<TabsTrigger value={"2"}>Outputs</TabsTrigger>
)}
{haveChat && (
<TabsTrigger value={"0"}>Memories</TabsTrigger>
)}
{haveChat && <TabsTrigger value={"0"}>Chat</TabsTrigger>}
</TabsList>
</div>
@@ -366,92 +400,49 @@
})}
</TabsContent>
<TabsContent value={"0"} className="api-modal-tabs-content">
{sessions.map((session, index) => {
return (
<div
key={index}
className="file-component-accordion-div cursor-pointer"
onClick={(event) => {
event.stopPropagation();
setSelectedViewField({
id: session,
type: "Session",
});
}}
>
<div className="flex w-full items-center justify-between gap-2 overflow-hidden border-b px-2 py-3.5 align-middle">
<ShadTooltip styleClasses="z-50" content={session}>
<div className="flex min-w-0">
<Badge
variant="gray"
size="md"
className="block truncate"
>
{session === currentFlowId
? "Default Session"
: session}
</Badge>
</div>
</ShadTooltip>
<div className="flex shrink-0 items-center justify-center gap-2 align-middle">
<Button
unstyled
size="icon"
onClick={(e) => {
e.preventDefault();
e.stopPropagation();
handleDeleteSession(session);
if (selectedViewField?.id === session)
setSelectedViewField(undefined);
}}
>
<ShadTooltip
styleClasses="z-50"
content={"Delete"}
>
<div>
<IconComponent
name="Trash2"
className="h-4 w-4"
></IconComponent>
</div>
</ShadTooltip>
</Button>
{/* <div>
<ShadTooltip
styleClasses="z-50"
content={
flow_sessions.some(
(f_session) =>
f_session?.session_id === session,
)
? "Active Session"
: "Inactive Session"
}
>
<div
className={cn(
"h-2 w-2 rounded-full",
flow_sessions.some(
(f_session) =>
f_session?.session_id === session,
)
? "bg-status-green"
: "bg-slate-500",
)}
></div>
</ShadTooltip>
</div> */}
</div>
</div>
</div>
);
})}
{sessions.map((session, index) => (
<SessionSelector
setSelectedView={setSelectedViewField}
selectedView={selectedViewField}
key={index}
session={session}
deleteSession={(session) => {
handleDeleteSession(session);
if (selectedViewField?.id === session) {
setSelectedViewField(undefined);
}
}}
updateVisibleSession={(session) => {
setvisibleSession(session);
}}
toggleVisibility={() => {
setvisibleSession(session);
}}
isVisible={visibleSession === session}
inspectSession={(session) => {
setSelectedViewField({
id: session,
type: "Session",
});
}}
/>
))}
{!sessions.length && (
<span className="text-sm text-muted-foreground">
No memories available.
</span>
)}
{sessions.length > 0 && (
<div className="pt-6">
<Button
onClick={(_) => {
setvisibleSession(undefined);
}}
>
New Chat
</Button>
</div>
)}
</TabsContent>
</Tabs>
</div>
@@ -517,11 +508,13 @@
>
{haveChat ? (
<ChatView
focusChat={sessionId}
sendMessage={sendMessage}
chatValue={chatValue}
setChatValue={setChatValue}
lockChat={lockChat}
setLockChat={setLockChat}
visibleSession={visibleSession}
/>
) : (
<span className="flex h-full w-full items-center justify-center font-thin text-muted-foreground">


@@ -522,6 +522,7 @@ const useFlowStore = create<FlowStoreType>((set, get) => ({
files,
silent,
setLockChat,
session,
}: {
startNodeId?: string;
stopNodeId?: string;
@@ -529,6 +530,7 @@ const useFlowStore = create<FlowStoreType>((set, get) => ({
files?: string[];
silent?: boolean;
setLockChat?: (lock: boolean) => void;
session?: string;
}) => {
get().setIsBuilding(true);
get().setLockChat(true);
@@ -633,6 +635,7 @@ const useFlowStore = create<FlowStoreType>((set, get) => ({
useFlowStore.getState().updateBuildStatus([vertexBuildData.id], status);
}
await buildFlowVerticesWithFallback({
session,
input_value,
files,
flowId: currentFlow!.id,


@@ -10,10 +10,6 @@ export const useMessagesStore = create<MessagesStoreType>((set, get) => ({
return { messages: updatedMessages };
});
},
columns: [],
setColumns: (columns) => {
set(() => ({ columns: columns }));
},
messages: [],
setMessages: (messages) => {
set(() => ({ messages: messages }));
@@ -33,6 +29,20 @@ export const useMessagesStore = create<MessagesStoreType>((set, get) => ({
),
}));
},
updateMessagePartial: (message) => {
  // Search the message list backwards so the newest matching message
  // (the usual target of a streamed update) is found faster.
  set((state) => {
    const updatedMessages = [...state.messages];
    for (let i = updatedMessages.length - 1; i >= 0; i--) {
      if (updatedMessages[i].id === message.id) {
        updatedMessages[i] = { ...updatedMessages[i], ...message };
        break;
      }
    }
    return { messages: updatedMessages };
  });
},
clearMessages: () => {
set(() => ({ messages: [] }));
},
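The reverse-search merge in `updateMessagePartial` can be sketched in isolation. The `Msg` shape and `mergePartial` name below are illustrative only, not the store's real `Message` type or API:

```typescript
// Hypothetical minimal message shape, for illustration only.
type Msg = { id: string; text: string };

// Merge a partial update into the newest message with a matching id.
// Searching backwards suits streaming, since token events almost always
// target the most recently appended message.
function mergePartial(
  messages: Msg[],
  patch: Partial<Msg> & { id: string },
): Msg[] {
  const updated = [...messages];
  for (let i = updated.length - 1; i >= 0; i--) {
    if (updated[i].id === patch.id) {
      updated[i] = { ...updated[i], ...patch };
      break; // ids are unique, so the first hit from the end is the one
    }
  }
  return updated; // input array and its objects are left untouched
}
```

Copying the array and spreading each patched object keeps the update immutable, which is what lets Zustand detect the change and re-render subscribers.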


@@ -6,12 +6,16 @@ export type ChatMessageType = {
template?: string;
isSend: boolean;
thought?: string;
files?: Array<{ path: string; type: string; name: string }>;
files?: Array<{ path: string; type: string; name: string } | string>;
prompt?: string;
chatKey?: string;
componentId: string;
componentId?: string;
id: string;
timestamp: string;
stream_url?: string | null;
sender_name?: string;
session?: string;
edit?: boolean;
};
export type ChatOutputType = {


@@ -719,6 +719,8 @@ export type chatViewProps = {
setChatValue: (value: string) => void;
lockChat: boolean;
setLockChat: (lock: boolean) => void;
visibleSession?: string;
focusChat?: string;
};
export type IOFileInputProps = {


@@ -1,13 +1,13 @@
type Message = {
artifacts: Record<string, any>;
flow_id: string;
message: string;
text: string;
sender: string;
sender_name: string;
session_id: string;
timestamp: string;
files: Array<string>;
id: string;
edit: boolean;
};
export type { Message };


@@ -130,6 +130,7 @@ export type FlowStoreType = {
files,
silent,
setLockChat,
session,
}: {
setLockChat?: (lock: boolean) => void;
startNodeId?: string;
@@ -137,6 +138,7 @@ export type FlowStoreType = {
input_value?: string;
files?: string[];
silent?: boolean;
session?: string;
}) => Promise<void>;
getFlow: () => { nodes: Node[]; edges: Edge[]; viewport: Viewport };
updateVerticesBuild: (


@@ -7,9 +7,8 @@ export type MessagesStoreType = {
addMessage: (message: Message) => void;
removeMessage: (message: Message) => void;
updateMessage: (message: Message) => void;
updateMessagePartial: (message: Partial<Message>) => void;
clearMessages: () => void;
removeMessages: (ids: string[]) => void;
columns: Array<ColDef | ColGroupDef>;
setColumns: (columns: Array<ColDef | ColGroupDef>) => void;
deleteSession: (id: string) => void;
};


@@ -1,6 +1,8 @@
import { BASE_URL_API } from "@/constants/constants";
import { performStreamingRequest } from "@/controllers/API/api";
import { useMessagesStore } from "@/stores/messagesStore";
import { AxiosError } from "axios";
import { Edge, Node } from "reactflow";
import { BuildStatus } from "../constants/enums";
import { getVerticesOrder, postBuildVertex } from "../controllers/API";
@@ -32,6 +34,7 @@ type BuildVerticesParams = {
nodes?: Node[];
edges?: Edge[];
logBuilds?: boolean;
session?: string;
};
function getInactiveVertexData(vertexId: string): VertexBuildTypeAPI {
@@ -152,7 +155,9 @@ export async function buildFlowVertices({
edges,
logBuilds,
setLockChat,
session,
}: BuildVerticesParams) {
const inputs = {};
let url = `${BASE_URL_API}build/${flowId}/flow?`;
if (startNodeId) {
url = `${url}&start_component_id=${startNodeId}`;
@@ -164,9 +169,6 @@
url = `${url}&log_builds=${logBuilds}`;
}
const postData = {};
if (typeof input_value !== "undefined") {
postData["inputs"] = { input_value: input_value };
}
if (files) {
postData["files"] = files;
}
@@ -176,6 +178,15 @@
edges,
};
}
if (typeof input_value !== "undefined") {
inputs["input_value"] = input_value;
}
if (session) {
inputs["session"] = session;
}
if (Object.keys(inputs).length > 0) {
postData["inputs"] = inputs;
}
const buildResults: Array<boolean> = [];
@@ -188,6 +199,8 @@
onBuildStart(ids.map((id) => ({ id: id, reference: id })));
ids.forEach((id) => verticesStartTimeMs.set(id, Date.now()));
};
switch (type) {
case "vertices_sorted": {
const verticesToRun = data.to_run;
@@ -266,6 +279,19 @@
}
return true;
}
case "message": {
// adds a message to the message table
useMessagesStore.getState().addMessage(data);
return true;
}
case "token": {
// wait ~10 ms before resolving so React's batched updates don't collapse consecutive token events
await new Promise((resolve) => {
useMessagesStore.getState().updateMessagePartial(data);
setTimeout(resolve, 10);
});
return true;
}
case "end": {
const allNodesValid = buildResults.every((result) => result);
onBuildComplete!(allNodesValid);
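The `token` branch above applies the partial update and then defers behind a short timeout. The pattern can be sketched on its own; the helper name `applyTokenUpdate` and the 10 ms default are assumptions that mirror the diff, not a public API:

```typescript
// Run a streamed token update, then pause ~10 ms before resolving so the
// event loop lets each partial update render instead of being coalesced
// by React's update batching.
function applyTokenUpdate(apply: () => void, delayMs = 10): Promise<void> {
  return new Promise((resolve) => {
    apply(); // the store update itself happens synchronously
    setTimeout(resolve, delayMs); // only the event stream waits
  });
}
```

Awaiting this in the event handler throttles how fast the stream consumer pulls the next `token` event, without delaying the state change itself.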


@@ -1719,3 +1719,14 @@ export function checkOldComponents({ nodes }: { nodes: any[] }) {
),
);
}
export function someFlowTemplateFields(
{ nodes }: { nodes: NodeType[] },
validateFn: (field: InputFieldType) => boolean,
): boolean {
return nodes.some((node) => {
return Object.keys(node.data.node?.template ?? {}).some((field) => {
return validateFn((node.data.node?.template ?? {})[field]);
});
});
}
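`someFlowTemplateFields` is a short-circuiting any-match over every node's template fields. A self-contained sketch, with `Field`/`FlowNode` as simplified stand-ins for the real `InputFieldType`/`NodeType` (which are richer):

```typescript
// Simplified stand-ins for InputFieldType / NodeType.
type Field = { required?: boolean };
type FlowNode = { data: { node?: { template?: Record<string, Field> } } };

// True if any field of any node's template satisfies the predicate.
// Array.prototype.some short-circuits, so the scan stops at the first hit.
// Object.values replaces the original's Object.keys + index lookup.
function someFlowTemplateFields(
  { nodes }: { nodes: FlowNode[] },
  validateFn: (field: Field) => boolean,
): boolean {
  return nodes.some((node) =>
    Object.values(node.data.node?.template ?? {}).some(validateFn),
  );
}
```

Nodes without a `node` or `template` fall through via `?? {}` rather than throwing.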

View file

@@ -633,6 +633,18 @@ export function addPlusSignes(array: string[]): string[] {
});
}
export function removeDuplicatesBasedOnAttribute<T>(
arr: T[],
attribute: string,
): T[] {
const seen = new Set();
const filteredChatHistory = arr.filter((item) => {
const duplicate = seen.has(item[attribute]);
seen.add(item[attribute]);
return !duplicate;
});
return filteredChatHistory;
}
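A quick usage sketch of the first-occurrence-wins dedupe above, restated so it runs standalone (the explicit cast replaces the original's untyped `item[attribute]` access; the `session_id` example data is hypothetical):

```typescript
// Keep only the first item seen for each value of `attribute`.
function removeDuplicatesBasedOnAttribute<T>(arr: T[], attribute: string): T[] {
  const seen = new Set<unknown>();
  return arr.filter((item) => {
    const key = (item as Record<string, unknown>)[attribute];
    const duplicate = seen.has(key);
    seen.add(key);
    return !duplicate; // first occurrence passes, later ones are dropped
  });
}
```

For example, deduping chat-history rows by `session_id` keeps the earliest row per session while preserving the original order of the survivors.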
export function isSupportedNodeTypes(type: string) {
return Object.keys(DRAG_EVENTS_CUSTOM_TYPESS).some((key) => key === type);
}


@@ -221,7 +221,7 @@ test("user must be able to freeze a component", async ({ page }) => {
await page.getByTestId("output-inspection-message").first().click();
await page.getByRole("gridcell").first().click();
await page.getByRole("gridcell").nth(4).click();
const firstRunWithoutFreezing = await page
.getByPlaceholder("Empty")
@@ -244,7 +244,7 @@ test("user must be able to freeze a component", async ({ page }) => {
await page.getByTestId("output-inspection-message").first().click();
await page.getByRole("gridcell").first().click();
await page.getByRole("gridcell").nth(4).click();
const secondRunWithoutFreezing = await page
.getByPlaceholder("Empty")
@@ -288,7 +288,7 @@ test("user must be able to freeze a component", async ({ page }) => {
await page.getByTestId("output-inspection-message").first().click();
await page.getByRole("gridcell").first().click();
await page.getByRole("gridcell").nth(4).click();
const firstTextFreezed = await page.getByPlaceholder("Empty").textContent();
@@ -323,7 +323,7 @@ test("user must be able to freeze a component", async ({ page }) => {
await page.getByTestId("output-inspection-message").first().click();
await page.getByRole("gridcell").first().click();
await page.getByRole("gridcell").nth(4).click();
const thirdTextWithoutFreezing = await page
.getByPlaceholder("Empty")


@@ -0,0 +1,262 @@
import { expect, test } from "@playwright/test";
import * as dotenv from "dotenv";
import path from "path";
test("fresh start playground", async ({ page }) => {
if (!process.env.CI) {
dotenv.config({ path: path.resolve(__dirname, "../../.env") });
}
await page.goto("/");
await page.locator("span").filter({ hasText: "My Collection" }).isVisible();
await page.waitForSelector('[data-testid="mainpage_title"]', {
timeout: 30000,
});
await page.waitForSelector('[id="new-project-btn"]', {
timeout: 30000,
});
let modalCount = 0;
try {
const modalTitleElement = await page?.getByTestId("modal-title");
if (modalTitleElement) {
modalCount = await modalTitleElement.count();
}
} catch (error) {
modalCount = 0;
}
while (modalCount === 0) {
await page.getByText("New Project", { exact: true }).click();
await page.waitForTimeout(3000);
modalCount = await page.getByTestId("modal-title")?.count();
}
await page.waitForSelector('[data-testid="blank-flow"]', {
timeout: 30000,
});
await page.getByTestId("blank-flow").click();
await page.waitForSelector('[data-testid="extended-disclosure"]', {
timeout: 30000,
});
await page.getByTestId("extended-disclosure").click();
await page.getByPlaceholder("Search").click();
await page.getByPlaceholder("Search").fill("chat output");
await page.waitForTimeout(1000);
await page
.getByTestId("outputsChat Output")
.dragTo(page.locator('//*[@id="react-flow-id"]'));
await page.mouse.up();
await page.mouse.down();
await page.getByPlaceholder("Search").click();
await page.getByPlaceholder("Search").fill("chat input");
await page.waitForTimeout(1000);
await page
.getByTestId("inputsChat Input")
.dragTo(page.locator('//*[@id="react-flow-id"]'));
await page.mouse.up();
await page.mouse.down();
await page.waitForSelector('[title="fit view"]', {
timeout: 100000,
});
await page.getByTitle("fit view").click();
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
const elementsChatInput = await page
.locator('[data-testid="handle-chatinput-shownode-message-right"]')
.all();
let visibleElementHandle;
for (const element of elementsChatInput) {
if (await element.isVisible()) {
visibleElementHandle = element;
break;
}
}
// Click and hold on the first element
await visibleElementHandle.hover();
await page.mouse.down();
// Move to the second element
const elementsChatOutput = await page
.getByTestId("handle-chatoutput-shownode-text-left")
.all();
for (const element of elementsChatOutput) {
if (await element.isVisible()) {
visibleElementHandle = element;
break;
}
}
await visibleElementHandle.hover();
// Release the mouse
await page.mouse.up();
await page.getByLabel("fit view").click();
await page.getByText("Playground", { exact: true }).last().click();
await page.waitForSelector('[data-testid="input-chat-playground"]', {
timeout: 100000,
});
//send message
await page.getByTestId("input-chat-playground").click();
await page.getByTestId("input-chat-playground").fill("message 1");
await page.keyboard.press("Enter");
//check message
await page.getByTestId("chat-message-User-message 1").click();
await page
.getByTestId("chat-message-AI-message 1")
.getByText("message")
.click();
//check session
await page.getByText("Default Session").click();
await page.getByTestId("chat-message-User-message 1").click();
//check edit message
await page.getByTestId("chat-message-User-message 1").hover();
await page
.locator("div")
.filter({ hasText: /^Usermessage 1$/ })
.getByTestId("icon-pencil")
.click();
await page.getByTestId("textarea").fill("edit_1");
await page.getByTestId("save-button").click();
await page.getByTestId("chat-message-User-edit_1").click();
await page.getByTestId("chat-message-User-edit_1").hover();
// check cancel edit
await page
.locator("div")
.filter({ hasText: /^Useredit_1$/ })
.getByTestId("icon-pencil")
.click();
await page.getByTestId("textarea").fill("cancel_edit");
await page.getByTestId("cancel-button").click();
await page.getByTestId("chat-message-User-edit_1").click();
await page.getByTestId("chat-message-User-edit_1").hover();
// check cancel edit blur
await page
.locator("div")
.filter({ hasText: /^Useredit_1$/ })
.getByTestId("icon-pencil")
.click();
await page.getByTestId("textarea").fill("cancel_edit_blur");
await page
.getByLabel("Playground")
.locator("div")
.filter({ hasText: "ChatDefault" })
.nth(2)
.click();
await page.getByTestId("chat-message-User-edit_1").click();
//check edit bot message
await page
.getByTestId("chat-message-AI-message 1")
.getByText("message")
.click();
await page.getByTestId("chat-message-AI-message 1").hover();
await page
.locator("div")
.filter({ hasText: /^AImessage 1$/ })
.getByTestId("icon-pencil")
.click();
await page.getByTestId("textarea").fill("edit_bot_1");
await page.getByTestId("save-button").click();
await page.getByText("edit_bot_1").click();
// check cancel edit bot
await page.getByTestId("chat-message-AI-edit_bot_1").hover();
await page
.locator("div")
.filter({ hasText: /^AIedit_bot_1$/ })
.getByTestId("icon-pencil")
.click();
await page.getByTestId("textarea").fill("edit_bot_cancel");
await page.getByTestId("cancel-button").click();
await page.getByText("edit_bot_1").click();
await page.getByTestId("chat-message-AI-edit_bot_1").hover();
// check cancel edit bot blur
await page
.locator("div")
.filter({ hasText: /^AIedit_bot_1$/ })
.getByTestId("icon-pencil")
.click();
await page.getByTestId("textarea").fill("edit_bot_blur_cancel");
await page
.getByLabel("Playground")
.locator("div")
.filter({ hasText: "ChatDefault" })
.nth(2)
.click();
await page.getByText("edit_bot_1").click();
// check table messages view
await page.getByRole("combobox").click();
await page.getByLabel("Message logs").click();
await page.getByText("Page 1 of 1", { exact: true }).click();
// check rename session
await page.getByRole("combobox").click();
await page.getByLabel("Rename").getByText("Rename").click();
await page.getByRole("textbox").fill("new name");
await page
.getByLabel("Chat", { exact: true })
.getByTestId("icon-Check")
.click();
await page.getByLabel("Chat", { exact: true }).getByText("new name").click();
// check cancel rename
await page.getByRole("combobox").click();
await page.getByLabel("Rename").getByText("Rename").click();
await page.getByRole("textbox").fill("cancel name");
await page.getByLabel("Chat", { exact: true }).getByTestId("icon-X").click();
await page.getByLabel("Chat", { exact: true }).getByText("new name").click();
// check cancel rename blur
await page.getByRole("combobox").click();
await page.getByLabel("Rename").getByText("Rename").click();
await page.getByRole("textbox").fill("cancel_blur");
await page.getByRole("tab", { name: "Chat" }).click();
await page.getByLabel("Chat", { exact: true }).getByText("new name").click();
// check delete session
await page.getByRole("combobox").click();
await page.getByLabel("Delete").click();
await page.getByText("No memories available.").click();
// check new session
await page.getByTestId("input-chat-playground").click();
await page.getByTestId("input-chat-playground").fill("session_after_delete");
await page.keyboard.press("Enter");
await page.getByTestId("chat-message-User-session_after_delete").click();
await expect(page.getByTestId("session-selector")).toBeVisible();
// check new chat
await page.getByRole("button", { name: "New Chat" }).click();
await page.waitForTimeout(3000);
await page.getByText("👋 Langflow Chat").click();
await page.getByTestId("input-chat-playground").click();
await page.getByTestId("input-chat-playground").fill("second session");
await page.keyboard.press("Enter");
await page.getByTestId("chat-message-User-second session").click();
await page
.getByTestId("chat-message-AI-second session")
.getByText("second session")
.click();
expect(await page.getByTestId("session-selector").count()).toBe(2);
const sessionElements = await page
.getByLabel("Playground")
.getByText(/^Session .+/)
.all();
expect(sessionElements.length).toBe(2);
});


@@ -109,7 +109,8 @@ test("Basic Prompting (Hello, World)", async ({ page }) => {
await page.getByText("files", { exact: true }).last().isVisible();
await page.getByRole("gridcell").last().isVisible();
await page.getByTestId("icon-Trash2").first().click();
await page.getByRole("combobox").click();
await page.getByLabel("Delete").click();
await page.waitForSelector('[data-testid="input-chat-playground"]', {
timeout: 100000,
});


@@ -108,8 +108,8 @@ test("Document QA", async ({ page }) => {
await page.getByText("files", { exact: true }).last().isVisible();
await page.getByRole("gridcell").last().isVisible();
await page.getByTestId("icon-Trash2").first().click();
await page.getByRole("combobox").click();
await page.getByLabel("Delete").click();
await page.waitForSelector('[data-testid="input-chat-playground"]', {
timeout: 100000,
});


@@ -116,7 +116,8 @@ test("Memory Chatbot", async ({ page }) => {
await page.getByText("files", { exact: true }).last().isVisible();
await page.getByRole("gridcell").last().isVisible();
await page.getByTestId("icon-Trash2").first().click();
await page.getByRole("combobox").click();
await page.getByLabel("Delete").click();
await page.waitForSelector('[data-testid="input-chat-playground"]', {
timeout: 100000,
});


@@ -182,8 +182,10 @@ test("Vector Store RAG", async ({ page }) => {
.last()
.isVisible();
await page.getByText("Memories", { exact: true }).last().click();
await page.getByText("Chat", { exact: true }).last().click();
await page.getByText("Default Session").last().click();
await page.getByRole("combobox").click();
await page.getByLabel("Message logs").click();
await page.getByText("timestamp", { exact: true }).last().isVisible();
await page.getByText("text", { exact: true }).last().isVisible();
@@ -193,8 +195,8 @@
await page.getByText("files", { exact: true }).last().isVisible();
await page.getByRole("gridcell").last().isVisible();
await page.getByTestId("icon-Trash2").first().click();
await page.getByRole("combobox").click();
await page.getByLabel("Delete").click();
await page.waitForSelector('[data-testid="input-chat-playground"]', {
timeout: 100000,
});


@@ -111,7 +111,7 @@ test("chat_io_teste", async ({ page }) => {
});
await page.getByTestId("input-chat-playground").click();
await page.getByTestId("input-chat-playground").fill("teste");
await page.getByRole("button").nth(1).click();
await page.getByTestId("icon-LucideSend").first().click();
const chat_output = page.getByTestId("chat-message-AI-teste");
const chat_input = page.getByTestId("chat-message-User-teste");
await expect(chat_output).toHaveText("teste");


@@ -53,15 +53,33 @@ test("user should be able to use duckduckgo search component", async ({
await page.getByTitle("fit view").click();
await page.waitForSelector("text=built successfully", { timeout: 30000 });
const result = await Promise.race([
page.waitForSelector("text=built successfully", { timeout: 30000 }),
page.waitForSelector("text=ratelimit", { timeout: 30000 }),
]);
await page.waitForTimeout(1000);
if (result) {
const isBuiltSuccessfully =
(await page.evaluate((el) => el.textContent, result))?.includes(
"built successfully",
) ?? false;
await page.getByTestId("output-inspection-data").first().click();
const isRateLimit =
(await page.evaluate((el) => el.textContent, result))?.includes(
"ratelimit",
) ?? false;
await page.getByRole("gridcell").first().click();
const searchResults = await page.getByPlaceholder("Empty").inputValue();
expect(searchResults.length).toBeGreaterThan(10);
expect(searchResults.toLowerCase()).toContain("langflow");
if (isBuiltSuccessfully) {
await page.waitForTimeout(1000);
await page.getByTestId("output-inspection-data").first().click();
await page.getByRole("gridcell").first().click();
const searchResults = await page.getByPlaceholder("Empty").inputValue();
expect(searchResults.length).toBeGreaterThan(10);
expect(searchResults.toLowerCase()).toContain("langflow");
} else if (isRateLimit) {
expect(true).toBeTruthy();
} else {
expect(true).toBeFalsy();
}
}
});


@@ -1,104 +0,0 @@
import { expect, test } from "@playwright/test";
import * as dotenv from "dotenv";
import path from "path";
test("erase button should clear the chat messages", async ({ page }) => {
test.skip(
!process?.env?.OPENAI_API_KEY,
"OPENAI_API_KEY required to run this test",
);
if (!process.env.CI) {
dotenv.config({ path: path.resolve(__dirname, "../../.env") });
}
await page.goto("/");
await page.waitForTimeout(1000);
let modalCount = 0;
try {
const modalTitleElement = await page?.getByTestId("modal-title");
if (modalTitleElement) {
modalCount = await modalTitleElement.count();
}
} catch (error) {
modalCount = 0;
}
while (modalCount === 0) {
await page.getByText("New Project", { exact: true }).click();
await page.waitForTimeout(3000);
modalCount = await page.getByTestId("modal-title")?.count();
}
await page.getByTestId("side_nav_options_all-templates").click();
await page.getByRole("heading", { name: "Basic Prompting" }).click();
await page.waitForTimeout(1000);
await page.getByTitle("fit view").click();
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
await page.getByTitle("zoom out").click();
let outdatedComponents = await page.getByTestId("icon-AlertTriangle").count();
while (outdatedComponents > 0) {
await page.getByTestId("icon-AlertTriangle").first().click();
await page.waitForTimeout(1000);
outdatedComponents = await page.getByTestId("icon-AlertTriangle").count();
}
await page
.getByTestId("popover-anchor-input-api_key")
.fill(process.env.OPENAI_API_KEY ?? "");
await page.getByTestId("dropdown_str_model_name").click();
await page.getByTestId("gpt-4o-1-option").click();
await page.waitForTimeout(1000);
await page.getByText("Playground", { exact: true }).last().click();
await page.waitForSelector('[data-testid="input-chat-playground"]', {
timeout: 100000,
});
await page.getByTestId("input-chat-playground").fill("Hello, how are you?");
await page.waitForSelector('[data-testid="icon-LucideSend"]', {
timeout: 100000,
});
await page.getByTestId("icon-LucideSend").click();
let valueUser = await page.getByTestId("sender_name_user").textContent();
await page.waitForSelector('[data-testid="sender_name_ai"]', {
timeout: 30000,
});
let valueAI = await page.getByTestId("sender_name_ai").textContent();
expect(valueUser).toBe("User");
expect(valueAI).toBe("AI");
await page.getByTestId("icon-Eraser").last().click();
await page.getByText("Hello, how are you?").isHidden();
await page.getByText("AI", { exact: true }).last().isHidden();
await page.getByText("User", { exact: true }).last().isHidden();
await page.getByText("Start a conversation").isVisible();
await page.getByText("Langflow Chat").isVisible();
await page.waitForTimeout(1000);
await page.getByPlaceholder("Send a message...").fill("My name is John");
await page.waitForSelector('[data-testid="icon-LucideSend"]', {
timeout: 100000,
});
await page.getByTestId("icon-LucideSend").click();
await page.waitForSelector("text=AI", { timeout: 30000 });
await page.getByText("Hello, how are you?").isHidden();
});

uv.lock generated

@@ -1477,15 +1477,15 @@ wheels = [
[[package]]
name = "duckduckgo-search"
version = "6.3.0"
version = "6.3.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "click" },
{ name = "primp" },
]
sdist = { url = "https://files.pythonhosted.org/packages/da/e8/575e4d1879e49ee52438f8435b663b391c3a7a6e7383133ddcdb0e519da7/duckduckgo_search-6.3.0.tar.gz", hash = "sha256:e9f56955569325a7d9cacda2488ca78bf6629a459e74415892bee560b664f5eb", size = 33045 }
sdist = { url = "https://files.pythonhosted.org/packages/73/bb/86be039796c7574ec2afe7a989de99c12155b9e8900a3da7c5aebaa63c81/duckduckgo_search-6.3.1.tar.gz", hash = "sha256:f43c7fa61518537bb5327aa9411520b04baa7c32b199d816b97d38f451f71824", size = 33037 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/5b/14/8a1c9a31046dd5b35c69e348ccdeb1c9a533294a2372abf958c0a9d30c37/duckduckgo_search-6.3.0-py3-none-any.whl", hash = "sha256:9a231a7b325226811cf7d35a240f3f501e718ae10a1aa0a638cabc80e129dfe7", size = 27455 },
{ url = "https://files.pythonhosted.org/packages/61/c0/7737a8abed252b2af8e500a536d6fe553d018b5435f1a8fd8ccb5d9e574e/duckduckgo_search-6.3.1-py3-none-any.whl", hash = "sha256:408fbfe07ae084eca5d0b5ebd0234187362d9e507ca1549f264104ce13006b58", size = 27451 },
]
[[package]]
@@ -3645,7 +3645,7 @@ requires-dist = [
{ name = "couchbase", marker = "extra == 'couchbase'", specifier = ">=4.2.1" },
{ name = "ctransformers", marker = "extra == 'local'", specifier = ">=0.2.10" },
{ name = "dspy-ai", specifier = ">=2.4.0" },
{ name = "duckduckgo-search", specifier = ">=6.3.0" },
{ name = "duckduckgo-search", specifier = ">=6.3.1" },
{ name = "elasticsearch", specifier = ">=8.12.0" },
{ name = "faiss-cpu", specifier = ">=1.8.0" },
{ name = "fake-useragent", specifier = ">=1.5.0" },
@@ -5509,17 +5509,17 @@ wheels = [
[[package]]
name = "primp"
version = "0.6.3"
version = "0.6.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/63/9c/10d2c7b734228021cf17d92b2872ed53535103d29a38a6ad4eee89a8ae1b/primp-0.6.3.tar.gz", hash = "sha256:17d30ebe26864defad5232dbbe1372e80483940012356e1f68846bb182282039", size = 78662 }
sdist = { url = "https://files.pythonhosted.org/packages/12/52/93b448d711f33319408a796a6ca92796589b522ebe6b4ec81e41ed49b0f0/primp-0.6.4.tar.gz", hash = "sha256:0a3de63e46a50664bcdc76e7aaf7060bf8443698efa902864669c5fca0d1abdd", size = 79073 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/61/09/96e8327fd8c1224226d9a170daf1dcba7d3f6578edeb9f811803f2a61aba/primp-0.6.3-cp38-abi3-macosx_10_12_x86_64.whl", hash = "sha256:bdbe6a7cdaaf5c9ed863432a941f4a75bd4c6ff626cbc8d32fc232793c70ba06", size = 2719331 },
{ url = "https://files.pythonhosted.org/packages/b3/a1/b58ac752b0500208df1be3b762eeaff1a117ec3108e26b38821ae9ac31e0/primp-0.6.3-cp38-abi3-macosx_11_0_arm64.whl", hash = "sha256:eeb53eb987bdcbcd85740633470255cab887d921df713ffa12a36a13366c9cdb", size = 2517811 },
{ url = "https://files.pythonhosted.org/packages/f7/85/ca027d7ec6121346d5927205777ad74bf918f291a538d710b5f9e0957333/primp-0.6.3-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:78da53d3c92a8e3f05bd3286ac76c291f1b6fe5e08ea63b7ba92b0f9141800bb", size = 2826729 },
{ url = "https://files.pythonhosted.org/packages/3a/45/ff51ccc5dbc82afa6d2dda15bf07b21b6b44708d02934cd7562007d1f719/primp-0.6.3-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:86337b44deecdac752bd8112909987fc9fa9b894f30191c80a164dc8f895da53", size = 2739692 },
{ url = "https://files.pythonhosted.org/packages/74/a1/a626fb8d2f6499d3e05971b2dcd33d770244722ee52f7d2c4ab636c1157f/primp-0.6.3-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:d3cd9a22b97f3eae42b2a5fb99f00480daf4cd6d9b139e05b0ffb03f7cc037f3", size = 2900573 },
{ url = "https://files.pythonhosted.org/packages/09/55/c96cb510c9f7881fa01bc7b269e446d32635ab9f0adbd36918c69b45a140/primp-0.6.3-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:7732bec917e2d3c48a31cdb92e1250f4ad6203a1aa4f802bd9abd84f2286a1e0", size = 3058818 },
{ url = "https://files.pythonhosted.org/packages/98/de/5c1dab24c1bff7933d0e4e8f4d0f46b66fc23531258d8433789c4468cac3/primp-0.6.3-cp38-abi3-win_amd64.whl", hash = "sha256:1e4113c34b86c676ae321af185f03a372caef3ee009f1682c2d62e30ec87348c", size = 2757631 },
{ url = "https://files.pythonhosted.org/packages/0a/ff/772fefb7ba0f6a33efe17be6eb4a7e5230d336c3ad44e80ae001510cb8a5/primp-0.6.4-cp38-abi3-macosx_10_12_x86_64.whl", hash = "sha256:e627330c1f2b723b523dc2e47caacbc5b5d0cd51ca11583b42fb8cde4da60d7d", size = 2860909 },
{ url = "https://files.pythonhosted.org/packages/b3/65/b5d1b580fc3c90853b65927438343634904e4229bc0397d3036c6ac8e120/primp-0.6.4-cp38-abi3-macosx_11_0_arm64.whl", hash = "sha256:e0cb7c05dd56c8b9741042fd568c0983fc19b0f3aa209a3940ecc04b4fd60314", size = 2665664 },
{ url = "https://files.pythonhosted.org/packages/41/27/f9e6eecd25fad9adc2784a95f2c4ec06ab630c1ddece7d2aeeb0252a4937/primp-0.6.4-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a4adc200ccb39e130c478d8b1a94f43a5b359068c6cb65b7c848812f96d96992", size = 2972386 },
{ url = "https://files.pythonhosted.org/packages/d1/a2/3cad4f1d58ca654d007f0a614d64cb18ac03dbb4a6d4446f9657a6a37261/primp-0.6.4-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:0ebae2d3aa36b04028e4accf2609d31d2e6981659e8e2effb09ee8ba960192e1", size = 2883232 },
{ url = "https://files.pythonhosted.org/packages/50/2c/ca6caa67b31a47591bddf74be047368a18efec9efc016266cad1e931c58e/primp-0.6.4-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:77f5fa5b34eaf251815622258419a484a2a9179dcbae2a1e702a254d91f613f1", size = 3044107 },
{ url = "https://files.pythonhosted.org/packages/16/44/320346afc08b2c646a46eb93046f84a6310365e22b70f2a642f1bf36c37a/primp-0.6.4-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:14cddf535cd2c4987412e90ca3ca35ae52cddbee6e0f0953d26b33a652a95692", size = 3205782 },
{ url = "https://files.pythonhosted.org/packages/dd/3b/90a1675b81b9f130345f28d4095179c5d79702673c03e5b804330aa875d6/primp-0.6.4-cp38-abi3-win_amd64.whl", hash = "sha256:96177ec2dadc47eaecbf0b22d2e93aeaf964a1be9a71e6e318d2ffb9e4242743", size = 2907474 },
]
[[package]]