Add support for running flows by endpoint name (#2012)

* feat: Add support for running flows by endpoint name

This commit modifies the `simplified_run_flow` endpoint in `endpoints.py` so that flows can be run by endpoint name as well as by flow ID. The route parameter is now `flow_id_or_name`, which accepts either a UUID or a string endpoint name. The handler first tries to parse the parameter as a UUID; if that fails, it queries the database for a flow whose endpoint name matches. This gives API consumers a stable, human-readable alternative to the flow ID when executing flows.
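The fallback logic can be sketched as a small helper (the function name here is illustrative, not part of the change; the real endpoint inlines this and follows the name branch with a database query):

```python
from uuid import UUID


def parse_flow_identifier(flow_id_or_name: str):
    """Classify a path parameter as a flow UUID or an endpoint name.

    Returns ("id", UUID) when the value parses as a UUID; otherwise
    returns ("endpoint_name", str), signalling that a database lookup
    by endpoint name is needed.
    """
    try:
        return "id", UUID(flow_id_or_name)
    except ValueError:
        return "endpoint_name", flow_id_or_name
```

Because every valid UUID string parses unambiguously, there is no collision risk as long as endpoint names are restricted to letters, numbers, hyphens, and underscores without the full UUID shape.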

* feat: Add endpoint_name field to FlowType

This commit adds an optional `endpoint_name` string field to the `FlowType` interface in `index.ts`. The field holds the name of the endpoint associated with the flow, letting the frontend reference flows by endpoint name instead of only by flow ID.

* 🐛 (endpoints.py): change type of flow_id_or_name parameter from Union[str, UUID] to str to simplify the API and improve readability

* feat: Add migration utility functions for table, column, foreign key, and constraint existence checks

This commit adds utility functions to the `migration.py` module in the `langflow.utils` package: `table_exists`, `column_exists`, `foreign_key_exists`, and `constraint_exists`. Each takes the relevant table/column/key/constraint name along with a SQLAlchemy engine or connection, uses the `Inspector` class from `sqlalchemy.engine.reflection` to reflect the schema, and returns a boolean indicating whether the element exists. Encapsulating these checks in one place keeps migrations readable and safely re-runnable.
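The same existence-check idea can be shown with only the standard library, using sqlite3 in place of the SQLAlchemy `Inspector` (the function names mirror the new utilities, but this is a stand-in sketch, not the shipped implementation):

```python
import sqlite3


def table_exists(name: str, conn: sqlite3.Connection) -> bool:
    # sqlite3 stand-in for the Inspector.get_table_names() check
    row = conn.execute(
        "SELECT 1 FROM sqlite_master WHERE type='table' AND name=?", (name,)
    ).fetchone()
    return row is not None


def column_exists(table_name: str, column_name: str, conn: sqlite3.Connection) -> bool:
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk)
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table_name})")]
    return column_name in cols


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flow (id INTEGER PRIMARY KEY, endpoint_name TEXT)")
```

Migrations can then guard each `add_column`/`create_index` call behind one of these checks so re-running an upgrade on an already-migrated database is a no-op.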

* feat: Add unique constraints for per-user folders and flows

This commit adds per-user unique constraints in the database: `unique_folder_name` on the `folder` table, so each user can have only one folder with a given name, and `unique_flow_name` and `unique_flow_endpoint_name` on the `flow` table, enforcing uniqueness of flow names and endpoint names per user. Enforcing these rules at the database level improves data integrity and prevents duplicate entries regardless of which code path performs the insert.

* feat: Add poetry installation and caching steps to GitHub Actions workflow

This commit updates the GitHub Actions composite action file `action.yml` to install poetry via pipx at the pinned version, run `pipx ensurepath` so the poetry binary is on the PATH, and verify the installation with `poetry --version`. It also adds a step that restores pip and poetry cached dependencies using `actions/cache@v4`, making dependency installation faster and more reliable.

* refactor: Improve error handling in update_flow function

This commit improves the error handling in the `update_flow` function in `flows.py`. A new `elif isinstance(e, HTTPException)` branch re-raises `HTTPException`s unchanged, so errors raised during the update (such as the 404 for a missing flow) reach the client with their original status code instead of being wrapped in a generic 500 response. The handler also maps `UNIQUE constraint failed` database errors to a 400 response naming the offending column, and the remaining branching is simplified.
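The re-raise pattern can be sketched in isolation (the `HTTPException` class here is a stand-in for `fastapi.HTTPException`, and `perform_update` stands for the body of `update_flow`):

```python
class HTTPException(Exception):  # stand-in for fastapi.HTTPException
    def __init__(self, status_code: int, detail: str):
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail


def update_flow_sketch(perform_update):
    try:
        return perform_update()
    except Exception as e:
        if hasattr(e, "errors"):  # pydantic-style validation error
            raise HTTPException(400, str(e)) from e
        elif "UNIQUE constraint failed" in str(e):
            raise HTTPException(400, "Name must be unique") from e
        elif isinstance(e, HTTPException):
            raise e  # pass through with the original status code
        else:
            raise HTTPException(500, str(e)) from e
```

Without the `isinstance` branch, the final `else` would swallow a 404 raised inside the try block and return it as a 500.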
Gabriel Luiz Freitas Almeida 2024-05-30 07:46:28 -07:00 committed by GitHub
commit 94e8bf1e2b
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
22 changed files with 504 additions and 136 deletions

View file

@ -77,7 +77,12 @@ runs:
POETRY_VERSION: ${{ inputs.poetry-version }}
PYTHON_VERSION: ${{ inputs.python-version }}
# Install poetry using the python version installed by setup-python step.
run: pipx install "poetry==$POETRY_VERSION" --python '${{ steps.setup-python.outputs.python-path }}' --verbose
run: |
pipx install "poetry==$POETRY_VERSION" --python '${{ steps.setup-python.outputs.python-path }}' --verbose
pipx ensurepath
# Ensure the poetry binary is available in the PATH.
# Test that the poetry binary is available.
poetry --version
- name: Restore pip and poetry cached dependencies
uses: actions/cache@v4

View file

@ -11,6 +11,7 @@ from alembic import op
import sqlalchemy as sa
import sqlmodel
from sqlalchemy.engine.reflection import Inspector
from langflow.utils import migration
${imports if imports else ""}
# revision identifiers, used by Alembic.
@ -22,13 +23,9 @@ depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
table_names = inspector.get_table_names()
${upgrades if upgrades else "pass"}
def downgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
table_names = inspector.get_table_names()
${downgrades if downgrades else "pass"}

View file

@ -0,0 +1,42 @@
"""Add unique constraints per user in folder table
Revision ID: 1c79524817ed
Revises: 3bb0ddf32dfb
Create Date: 2024-05-29 23:12:09.146880
"""
from typing import Sequence, Union
from alembic import op
from sqlalchemy.engine.reflection import Inspector
# revision identifiers, used by Alembic.
revision: str = "1c79524817ed"
down_revision: Union[str, None] = "3bb0ddf32dfb"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
constraints_names = [constraint["name"] for constraint in inspector.get_unique_constraints("folder")]
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("folder", schema=None) as batch_op:
if "unique_folder_name" not in constraints_names:
batch_op.create_unique_constraint("unique_folder_name", ["user_id", "name"])
# ### end Alembic commands ###
def downgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
constraints_names = [constraint["name"] for constraint in inspector.get_unique_constraints("folder")]
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("folder", schema=None) as batch_op:
if "unique_folder_name" in constraints_names:
batch_op.drop_constraint("unique_folder_name", type_="unique")
# ### end Alembic commands ###

View file

@ -0,0 +1,54 @@
"""Add unique constraints per user in flow table
Revision ID: 3bb0ddf32dfb
Revises: a72f5cf9c2f9
Create Date: 2024-05-29 23:08:43.935040
"""
from typing import Sequence, Union
from alembic import op
from sqlalchemy.engine.reflection import Inspector
# revision identifiers, used by Alembic.
revision: str = "3bb0ddf32dfb"
down_revision: Union[str, None] = "a72f5cf9c2f9"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
# ### commands auto generated by Alembic - please adjust! ###
indexes_names = [index["name"] for index in inspector.get_indexes("flow")]
constraints_names = [constraint["name"] for constraint in inspector.get_unique_constraints("flow")]
with op.batch_alter_table("flow", schema=None) as batch_op:
if "ix_flow_endpoint_name" in indexes_names:
batch_op.drop_index("ix_flow_endpoint_name")
batch_op.create_index(batch_op.f("ix_flow_endpoint_name"), ["endpoint_name"], unique=False)
if "unique_flow_endpoint_name" not in constraints_names:
batch_op.create_unique_constraint("unique_flow_endpoint_name", ["user_id", "endpoint_name"])
if "unique_flow_name" not in constraints_names:
batch_op.create_unique_constraint("unique_flow_name", ["user_id", "name"])
# ### end Alembic commands ###
def downgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
# ### commands auto generated by Alembic - please adjust! ###
indexes_names = [index["name"] for index in inspector.get_indexes("flow")]
constraints_names = [constraint["name"] for constraint in inspector.get_unique_constraints("flow")]
with op.batch_alter_table("flow", schema=None) as batch_op:
if "unique_flow_name" in constraints_names:
batch_op.drop_constraint("unique_flow_name", type_="unique")
if "unique_flow_endpoint_name" in constraints_names:
batch_op.drop_constraint("unique_flow_endpoint_name", type_="unique")
if "ix_flow_endpoint_name" in indexes_names:
batch_op.drop_index(batch_op.f("ix_flow_endpoint_name"))
batch_op.create_index("ix_flow_endpoint_name", ["endpoint_name"], unique=1)
# ### end Alembic commands ###

View file

@ -0,0 +1,52 @@
"""Add endpoint name col
Revision ID: a72f5cf9c2f9
Revises: 29fe8f1f806b
Create Date: 2024-05-29 21:44:04.240816
"""
from typing import Sequence, Union
import sqlalchemy as sa
import sqlmodel
from alembic import op
from sqlalchemy.engine.reflection import Inspector
# revision identifiers, used by Alembic.
revision: str = "a72f5cf9c2f9"
down_revision: Union[str, None] = "29fe8f1f806b"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
# ### commands auto generated by Alembic - please adjust! ###
column_names = [column["name"] for column in inspector.get_columns("flow")]
indexes = inspector.get_indexes("flow")
index_names = [index["name"] for index in indexes]
with op.batch_alter_table("flow", schema=None) as batch_op:
if "endpoint_name" not in column_names:
batch_op.add_column(sa.Column("endpoint_name", sqlmodel.sql.sqltypes.AutoString(), nullable=True))
if "ix_flow_endpoint_name" not in index_names:
batch_op.create_index(batch_op.f("ix_flow_endpoint_name"), ["endpoint_name"], unique=True)
# ### end Alembic commands ###
def downgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
# ### commands auto generated by Alembic - please adjust! ###
column_names = [column["name"] for column in inspector.get_columns("flow")]
indexes = inspector.get_indexes("flow")
index_names = [index["name"] for index in indexes]
with op.batch_alter_table("flow", schema=None) as batch_op:
if "ix_flow_endpoint_name" in index_names:
batch_op.drop_index(batch_op.f("ix_flow_endpoint_name"))
if "endpoint_name" in column_names:
batch_op.drop_column("endpoint_name")
# ### end Alembic commands ###

View file

@ -53,10 +53,10 @@ def get_all(
raise HTTPException(status_code=500, detail=str(exc)) from exc
@router.post("/run/{flow_id}", response_model=RunResponse, response_model_exclude_none=True)
@router.post("/run/{flow_id_or_name}", response_model=RunResponse, response_model_exclude_none=True)
async def simplified_run_flow(
db: Annotated[Session, Depends(get_session)],
flow_id: UUID,
flow_id_or_name: str,
input_request: SimplifiedAPIRequest = SimplifiedAPIRequest(),
stream: bool = False,
api_key_user: User = Depends(api_key_security),
@ -111,8 +111,21 @@ async def simplified_run_flow(
This endpoint provides a powerful interface for executing flows with enhanced flexibility and efficiency, supporting a wide range of applications by allowing for dynamic input and output configuration along with performance optimizations through session management and caching.
"""
session_id = input_request.session_id
endpoint_name = None
flow_id_str = None
try:
try:
flow_id = UUID(flow_id_or_name)
except ValueError:
endpoint_name = flow_id_or_name
flow = db.exec(
select(Flow).where(Flow.endpoint_name == endpoint_name).where(Flow.user_id == api_key_user.id)
).first()
if flow is None:
raise ValueError(f"Flow with endpoint name {endpoint_name} not found")
flow_id = flow.id
flow_id_str = str(flow_id)
artifacts = {}
if input_request.session_id:
@ -172,10 +185,13 @@ async def simplified_run_flow(
# This means the Flow ID is not a valid UUID which means it can't find the flow
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(exc)) from exc
except ValueError as exc:
if f"Flow {flow_id_str} not found" in str(exc):
if flow_id_str and f"Flow {flow_id_str} not found" in str(exc):
logger.error(f"Flow {flow_id_str} not found")
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(exc)) from exc
elif f"Session {session_id} not found" in str(exc):
elif endpoint_name and f"Flow with endpoint name {endpoint_name} not found" in str(exc):
logger.error(f"Flow with endpoint name {endpoint_name} not found")
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(exc)) from exc
elif session_id and f"Session {session_id} not found" in str(exc):
logger.error(f"Session {session_id} not found")
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(exc)) from exc
else:

View file

@ -135,30 +135,49 @@ def update_flow(
settings_service=Depends(get_settings_service),
):
"""Update a flow."""
try:
db_flow = read_flow(
session=session,
flow_id=flow_id,
current_user=current_user,
settings_service=settings_service,
)
if not db_flow:
raise HTTPException(status_code=404, detail="Flow not found")
flow_data = flow.model_dump(exclude_unset=True)
if settings_service.settings.remove_api_keys:
flow_data = remove_api_keys(flow_data)
for key, value in flow_data.items():
if value is not None:
setattr(db_flow, key, value)
db_flow.updated_at = datetime.now(timezone.utc)
if db_flow.folder_id is None:
default_folder = session.exec(select(Folder).where(Folder.name == DEFAULT_FOLDER_NAME)).first()
if default_folder:
db_flow.folder_id = default_folder.id
session.add(db_flow)
session.commit()
session.refresh(db_flow)
return db_flow
except Exception as e:
# If it is a validation error, return the error message
if hasattr(e, "errors"):
raise HTTPException(status_code=400, detail=str(e)) from e
elif "UNIQUE constraint failed" in str(e):
# Get the name of the column that failed
columns = str(e).split("UNIQUE constraint failed: ")[1].split(".")[1].split("\n")[0]
# UNIQUE constraint failed: flow.user_id, flow.name
# or UNIQUE constraint failed: flow.name
# if the column has id in it, we want the other column
column = columns.split(",")[1] if "id" in columns.split(",")[0] else columns.split(",")[0]
db_flow = read_flow(
session=session,
flow_id=flow_id,
current_user=current_user,
settings_service=settings_service,
)
if not db_flow:
raise HTTPException(status_code=404, detail="Flow not found")
flow_data = flow.model_dump(exclude_unset=True)
if settings_service.settings.remove_api_keys:
flow_data = remove_api_keys(flow_data)
for key, value in flow_data.items():
if value is not None:
setattr(db_flow, key, value)
db_flow.updated_at = datetime.now(timezone.utc)
if db_flow.folder_id is None:
default_folder = session.exec(select(Folder).where(Folder.name == DEFAULT_FOLDER_NAME)).first()
if default_folder:
db_flow.folder_id = default_folder.id
session.add(db_flow)
session.commit()
session.refresh(db_flow)
return db_flow
raise HTTPException(
status_code=400, detail=f"{column.capitalize().replace('_', ' ')} must be unique"
) from e
elif isinstance(e, HTTPException):
raise e
else:
raise HTTPException(status_code=500, detail=str(e)) from e
@router.delete("/{flow_id}", status_code=200)

View file

@ -1,5 +1,6 @@
# Path: src/backend/langflow/services/database/models/flow/model.py
import re
import warnings
from datetime import datetime, timezone
from typing import TYPE_CHECKING, Dict, Optional
@ -7,7 +8,9 @@ from uuid import UUID, uuid4
import emoji
from emoji import purely_emoji # type: ignore
from fastapi import HTTPException, status
from pydantic import field_serializer, field_validator
from sqlalchemy import UniqueConstraint
from sqlmodel import JSON, Column, Field, Relationship, SQLModel
from langflow.schema.schema import Record
@ -26,6 +29,24 @@ class FlowBase(SQLModel):
is_component: Optional[bool] = Field(default=False, nullable=True)
updated_at: Optional[datetime] = Field(default_factory=lambda: datetime.now(timezone.utc), nullable=True)
folder_id: Optional[UUID] = Field(default=None, nullable=True)
endpoint_name: Optional[str] = Field(default=None, nullable=True, index=True)
@field_validator("endpoint_name")
@classmethod
def validate_endpoint_name(cls, v):
# Endpoint name must be a string containing only letters, numbers, hyphens, and underscores
if v is not None:
if not isinstance(v, str):
raise HTTPException(
status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
detail="Endpoint name must be a string",
)
if not re.match(r"^[a-zA-Z0-9_-]+$", v):
raise HTTPException(
status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
detail="Endpoint name must contain only letters, numbers, hyphens, and underscores",
)
return v
@field_validator("icon_bg_color")
def validate_icon_bg_color(cls, v):
@ -128,6 +149,11 @@ class Flow(FlowBase, table=True):
record = Record(data=data)
return record
__table_args__ = (
UniqueConstraint("user_id", "name", name="unique_flow_name"),
UniqueConstraint("user_id", "endpoint_name", name="unique_flow_endpoint_name"),
)
class FlowCreate(FlowBase):
user_id: Optional[UUID] = None
@ -145,3 +171,21 @@ class FlowUpdate(SQLModel):
description: Optional[str] = None
data: Optional[Dict] = None
folder_id: Optional[UUID] = None
endpoint_name: Optional[str] = None
@field_validator("endpoint_name")
@classmethod
def validate_endpoint_name(cls, v):
# Endpoint name must be a string containing only letters, numbers, hyphens, and underscores
if v is not None:
if not isinstance(v, str):
raise HTTPException(
status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
detail="Endpoint name must be a string",
)
if not re.match(r"^[a-zA-Z0-9_-]+$", v):
raise HTTPException(
status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
detail="Endpoint name must contain only letters, numbers, hyphens, and underscores",
)
return v

View file

@ -1,6 +1,7 @@
from typing import TYPE_CHECKING, List, Optional
from uuid import UUID, uuid4
from sqlalchemy import UniqueConstraint
from sqlmodel import Field, Relationship, SQLModel
from langflow.services.database.models.flow.model import FlowRead
@ -30,6 +31,8 @@ class Folder(FolderBase, table=True):
back_populates="folder", sa_relationship_kwargs={"cascade": "all, delete, delete-orphan"}
)
__table_args__ = (UniqueConstraint("user_id", "name", name="unique_folder_name"),)
class FolderCreate(FolderBase):
components_list: Optional[List[UUID]] = None

View file

@ -0,0 +1,65 @@
from sqlalchemy.engine.reflection import Inspector
def table_exists(name, conn):
"""
Check if a table exists.
Parameters:
name (str): The name of the table to check.
conn (sqlalchemy.engine.Engine or sqlalchemy.engine.Connection): The SQLAlchemy engine or connection to use.
Returns:
bool: True if the table exists, False otherwise.
"""
inspector = Inspector.from_engine(conn)
return name in inspector.get_table_names()
def column_exists(table_name, column_name, conn):
"""
Check if a column exists in a table.
Parameters:
table_name (str): The name of the table to check.
column_name (str): The name of the column to check.
conn (sqlalchemy.engine.Engine or sqlalchemy.engine.Connection): The SQLAlchemy engine or connection to use.
Returns:
bool: True if the column exists, False otherwise.
"""
inspector = Inspector.from_engine(conn)
return column_name in [column["name"] for column in inspector.get_columns(table_name)]
def foreign_key_exists(table_name, fk_name, conn):
"""
Check if a foreign key exists in a table.
Parameters:
table_name (str): The name of the table to check.
fk_name (str): The name of the foreign key to check.
conn (sqlalchemy.engine.Engine or sqlalchemy.engine.Connection): The SQLAlchemy engine or connection to use.
Returns:
bool: True if the foreign key exists, False otherwise.
"""
inspector = Inspector.from_engine(conn)
return fk_name in [fk["name"] for fk in inspector.get_foreign_keys(table_name)]
def constraint_exists(table_name, constraint_name, conn):
"""
Check if a constraint exists in a table.
Parameters:
table_name (str): The name of the table to check.
constraint_name (str): The name of the constraint to check.
conn (sqlalchemy.engine.Engine or sqlalchemy.engine.Connection): The SQLAlchemy engine or connection to use.
Returns:
bool: True if the constraint exists, False otherwise.
"""
inspector = Inspector.from_engine(conn)
constraints = inspector.get_unique_constraints(table_name)
return constraint_name in [constraint["name"] for constraint in constraints]

View file

@ -9,11 +9,14 @@ export const EditFlowSettings: React.FC<InputProps> = ({
name,
invalidNameList,
description,
endpointName,
maxLength = 50,
setName,
setDescription,
setEndpointName,
}: InputProps): JSX.Element => {
const [isMaxLength, setIsMaxLength] = useState(false);
const [isEndpointNameValid, setIsEndpointNameValid] = useState(true);
const handleNameChange = (event: ChangeEvent<HTMLInputElement>) => {
const { value } = event.target;
@ -29,6 +32,18 @@ export const EditFlowSettings: React.FC<InputProps> = ({
setDescription!(event.target.value);
};
const handleEndpointNameChange = (event: ChangeEvent<HTMLInputElement>) => {
// Validate the endpoint name
// use this regex r'^[a-zA-Z0-9_-]+$'
const isValid =
(/^[a-zA-Z0-9_-]+$/.test(event.target.value) &&
event.target.value.length <= maxLength) ||
// empty is also valid
event.target.value.length === 0;
setIsEndpointNameValid(isValid);
setEndpointName!(event.target.value);
};
//this function is necessary to select the text when double clicking, this was not working with the onFocus event
const handleFocus = (event) => event.target.select();
@ -91,6 +106,32 @@ export const EditFlowSettings: React.FC<InputProps> = ({
</span>
)}
</Label>
{setEndpointName && (
<Label>
<div className="edit-flow-arrangement mt-3">
<span className="font-medium">Endpoint name:</span>
{!isEndpointNameValid && (
<span className="edit-flow-span">
Invalid endpoint name. Use only letters, numbers, hyphens, and
underscores ({maxLength} characters max).
</span>
)}
</div>
<Input
className="nopan nodelete nodrag noundo nocopy mt-2 font-normal"
onChange={handleEndpointNameChange}
type="text"
name="endpoint_name"
value={endpointName ?? ""}
placeholder="An alternative name for the run endpoint"
maxLength={maxLength}
id="endpoint_name"
onDoubleClickCapture={(event) => {
handleFocus(event);
}}
/>
</Label>
)}
</>
);
};

View file

@ -33,7 +33,7 @@ const SideBarFoldersButtonsComponent = ({
const [foldersNames, setFoldersNames] = useState({});
const takeSnapshot = useFlowsManagerStore((state) => state.takeSnapshot);
const [editFolders, setEditFolderName] = useState(
folders.map((obj) => ({ name: obj.name, edit: false })),
folders.map((obj) => ({ name: obj.name, edit: false }))
);
const uploadFolder = useFolderStore((state) => state.uploadFolder);
const currentFolder = pathname.split("/");
@ -58,7 +58,7 @@ const SideBarFoldersButtonsComponent = ({
const { dragOver, dragEnter, dragLeave, onDrop } = useFileDrop(
folderId,
handleFolderChange,
handleFolderChange
);
const handleUploadFlowsToFolder = () => {
@ -73,7 +73,7 @@ const SideBarFoldersButtonsComponent = ({
addFolder({ name: "New Folder", parent_id: null, description: "" }).then(
(res) => {
getFoldersApi(true);
},
}
);
}
@ -91,8 +91,6 @@ const SideBarFoldersButtonsComponent = ({
folders.map((obj) => ({ name: obj.name, edit: false }));
}, [folders]);
return (
<>
<div className="flex shrink-0 items-center justify-between">
@ -120,7 +118,7 @@ const SideBarFoldersButtonsComponent = ({
<>
{folders.map((item, index) => {
const editFolderName = editFolders?.filter(
(folder) => folder.name === item.name,
(folder) => folder.name === item.name
)[0];
return (
<div
@ -136,7 +134,7 @@ const SideBarFoldersButtonsComponent = ({
? "border border-border bg-muted hover:bg-muted"
: "border hover:bg-transparent lg:border-transparent lg:hover:border-border",
"group flex w-full shrink-0 cursor-pointer gap-2 opacity-100 lg:min-w-full",
folderIdDragging === item.id! ? "bg-border" : "",
folderIdDragging === item.id! ? "bg-border" : ""
)}
onClick={() => handleChangeFolder!(item.id!)}
>
@ -206,7 +204,7 @@ const SideBarFoldersButtonsComponent = ({
folders.map((obj) => ({
name: obj.name,
edit: false,
})),
}))
);
}
if (e.key === "Enter") {
@ -239,10 +237,10 @@ const SideBarFoldersButtonsComponent = ({
};
const updatedFolder = await updateFolder(
body,
item.id!,
item.id!
);
const updateFolders = folders.filter(
(f) => f.name !== item.name,
(f) => f.name !== item.name
);
setFolders([...updateFolders, updatedFolder]);
setFoldersNames({});
@ -250,7 +248,7 @@ const SideBarFoldersButtonsComponent = ({
folders.map((obj) => ({
name: obj.name,
edit: false,
})),
}))
);
} else {
setFoldersNames((old) => ({

View file

@ -61,7 +61,7 @@ export async function sendAll(data: sendAllProps) {
}
export async function postValidateCode(
code: string,
code: string
): Promise<AxiosResponse<errorsTypeAPI>> {
return await api.post(`${BASE_URL_API}validate/code`, { code });
}
@ -76,7 +76,7 @@ export async function postValidateCode(
export async function postValidatePrompt(
name: string,
template: string,
frontend_node: APIClassType,
frontend_node: APIClassType
): Promise<AxiosResponse<PromptTypeAPI>> {
return api.post(`${BASE_URL_API}validate/prompt`, {
name,
@ -149,7 +149,7 @@ export async function saveFlowToDatabase(newFlow: {
* @throws Will throw an error if the update fails.
*/
export async function updateFlowInDatabase(
updatedFlow: FlowType,
updatedFlow: FlowType
): Promise<FlowType> {
try {
const response = await api.patch(`${BASE_URL_API}flows/${updatedFlow.id}`, {
@ -157,6 +157,7 @@ export async function updateFlowInDatabase(
data: updatedFlow.data,
description: updatedFlow.description,
folder_id: updatedFlow.folder_id === "" ? null : updatedFlow.folder_id,
endpoint_name: updatedFlow.endpoint_name,
});
if (response?.status !== 200) {
@ -326,7 +327,7 @@ export async function getHealth() {
*
*/
export async function getBuildStatus(
flowId: string,
flowId: string
): Promise<AxiosResponse<BuildStatusTypeAPI>> {
return await api.get(`${BASE_URL_API}build/${flowId}/status`);
}
@ -339,7 +340,7 @@ export async function getBuildStatus(
*
*/
export async function postBuildInit(
flow: FlowType,
flow: FlowType
): Promise<AxiosResponse<InitTypeAPI>> {
return await api.post(`${BASE_URL_API}build/init/${flow.id}`, flow);
}
@ -355,7 +356,7 @@ export async function postBuildInit(
*/
export async function uploadFile(
file: File,
id: string,
id: string
): Promise<AxiosResponse<UploadFileTypeAPI>> {
const formData = new FormData();
formData.append("file", file);
@ -364,7 +365,7 @@ export async function uploadFile(
export async function postCustomComponent(
code: string,
apiClass: APIClassType,
apiClass: APIClassType
): Promise<AxiosResponse<APIClassType>> {
// let template = apiClass.template;
return await api.post(`${BASE_URL_API}custom_component`, {
@ -377,7 +378,7 @@ export async function postCustomComponentUpdate(
code: string,
template: APITemplateType,
field: string,
field_value: any,
field_value: any
): Promise<AxiosResponse<APIClassType>> {
return await api.post(`${BASE_URL_API}custom_component/update`, {
code,
@ -399,7 +400,7 @@ export async function onLogin(user: LoginType) {
headers: {
"Content-Type": "application/x-www-form-urlencoded",
},
},
}
);
if (response.status === 200) {
@ -461,11 +462,11 @@ export async function addUser(user: UserInputType): Promise<Array<Users>> {
export async function getUsersPage(
skip: number,
limit: number,
limit: number
): Promise<Array<Users>> {
try {
const res = await api.get(
`${BASE_URL_API}users/?skip=${skip}&limit=${limit}`,
`${BASE_URL_API}users/?skip=${skip}&limit=${limit}`
);
if (res.status === 200) {
return res.data;
@ -502,7 +503,7 @@ export async function resetPassword(user_id: string, user: resetPasswordType) {
try {
const res = await api.patch(
`${BASE_URL_API}users/${user_id}/reset-password`,
user,
user
);
if (res.status === 200) {
return res.data;
@ -576,7 +577,7 @@ export async function saveFlowStore(
last_tested_version?: string;
},
tags: string[],
publicFlow = false,
publicFlow = false
): Promise<FlowType> {
try {
const response = await api.post(`${BASE_URL_API}store/components/`, {
@ -705,7 +706,7 @@ export async function postStoreComponents(component: Component) {
export async function getComponent(component_id: string) {
try {
const res = await api.get(
`${BASE_URL_API}store/components/${component_id}`,
`${BASE_URL_API}store/components/${component_id}`
);
if (res.status === 200) {
return res.data;
@ -720,7 +721,7 @@ export async function searchComponent(
page?: number | null,
limit?: number | null,
status?: string | null,
tags?: string[],
tags?: string[]
): Promise<StoreComponentResponse | undefined> {
try {
let url = `${BASE_URL_API}store/components/`;
@ -832,7 +833,7 @@ export async function updateFlowStore(
},
tags: string[],
publicFlow = false,
id: string,
id: string
): Promise<FlowType> {
try {
const response = await api.patch(`${BASE_URL_API}store/components/${id}`, {
@ -916,7 +917,7 @@ export async function deleteGlobalVariable(id: string) {
export async function updateGlobalVariable(
name: string,
value: string,
id: string,
id: string
) {
try {
const response = api.patch(`${BASE_URL_API}variables/${id}`, {
@ -935,7 +936,7 @@ export async function getVerticesOrder(
startNodeId?: string | null,
stopNodeId?: string | null,
nodes?: Node[],
Edges?: Edge[],
Edges?: Edge[]
): Promise<AxiosResponse<VerticesOrderTypeAPI>> {
// nodeId is optional and is a query parameter
// if nodeId is not provided, the API will return all vertices
@ -955,19 +956,19 @@ export async function getVerticesOrder(
return await api.post(
`${BASE_URL_API}build/${flowId}/vertices`,
data,
config,
config
);
}
export async function postBuildVertex(
flowId: string,
vertexId: string,
input_value: string,
input_value: string
): Promise<AxiosResponse<VertexBuildTypeAPI>> {
// input_value is optional and is a query parameter
return await api.post(
`${BASE_URL_API}build/${flowId}/vertices/${vertexId}`,
input_value ? { inputs: { input_value: input_value } } : undefined,
input_value ? { inputs: { input_value: input_value } } : undefined
);
}
@ -991,7 +992,7 @@ export async function getFlowPool({
}
export async function deleteFlowPool(
flowId: string,
flowId: string
): Promise<AxiosResponse<any>> {
const config = {};
config["params"] = { flow_id: flowId };
@ -999,7 +1000,7 @@ export async function deleteFlowPool(
}
export async function multipleDeleteFlowsComponents(
flowIds: string[],
flowIds: string[]
): Promise<AxiosResponse<any>> {
return await api.post(`${BASE_URL_API}flows/multiple_delete/`, {
flow_ids: flowIds,
@ -1009,7 +1010,7 @@ export async function multipleDeleteFlowsComponents(
export async function getTransactionTable(
id: string,
mode: "intersection" | "union",
params = {},
params = {}
): Promise<{ rows: Array<object>; columns: Array<ColDef | ColGroupDef> }> {
const config = {};
config["params"] = { flow_id: id };
@ -1024,7 +1025,7 @@ export async function getTransactionTable(
export async function getMessagesTable(
id: string,
mode: "intersection" | "union",
params = {},
params = {}
): Promise<{ rows: Array<object>; columns: Array<ColDef | ColGroupDef> }> {
const config = {};
config["params"] = { flow_id: id };

View file

@ -8,13 +8,14 @@ export default function getCurlCode(
flowId: string,
isAuth: boolean,
tweaksBuildedObject,
endpointName?: string
): string {
const tweaksObject = tweaksBuildedObject[0];
// show the endpoint name in the curl command if it exists
return `curl -X POST \\
${window.location.protocol}//${
window.location.host
}/api/v1/run/${flowId}?stream=false \\
${window.location.protocol}//${window.location.host}/api/v1/run/${
endpointName || flowId
}?stream=false \\
-H 'Content-Type: application/json'\\${
!isAuth ? `\n -H 'x-api-key: <your api key>'\\` : ""
}
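The curl generator above now targets the run endpoint by name when one is set, falling back to the flow UUID. A minimal TypeScript sketch of that selection logic (the `buildRunUrl` helper and its parameters are illustrative stand-ins, not functions from the Langflow codebase):

```typescript
// Hypothetical helper mirroring the fallback used by getCurlCode:
// prefer the human-readable endpoint name when present, else the flow ID.
function buildRunUrl(
  host: string,
  flowId: string,
  endpointName?: string
): string {
  const target = endpointName || flowId;
  return `${host}/api/v1/run/${target}?stream=false`;
}

console.log(buildRunUrl("http://localhost:7860", "aaaa-bbbb"));
console.log(buildRunUrl("http://localhost:7860", "aaaa-bbbb", "my-endpoint"));
```

Note that an empty-string endpoint name also falls through to the flow ID, since `||` treats `""` as falsy.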


@@ -18,11 +18,11 @@ import { buildContent } from "../utils/build-content";
import { buildTweaks } from "../utils/build-tweaks";
import { checkCanBuildTweakObject } from "../utils/check-can-build-tweak-object";
import { getChangesType } from "../utils/get-changes-types";
import { getNodesWithDefaultValue } from "../utils/get-nodes-with-default-value";
import { getValue } from "../utils/get-value";
import getPythonApiCode from "../utils/get-python-api-code";
import getCurlCode from "../utils/get-curl-code";
import { getNodesWithDefaultValue } from "../utils/get-nodes-with-default-value";
import getPythonApiCode from "../utils/get-python-api-code";
import getPythonCode from "../utils/get-python-code";
import { getValue } from "../utils/get-value";
import getWidgetCode from "../utils/get-widget-code";
import tabsArray from "../utils/tabs-array";
@@ -35,7 +35,7 @@ const ApiModal = forwardRef(
flow: FlowType;
children: ReactNode;
},
ref,
ref
) => {
const tweak = useTweaksStore((state) => state.tweak);
const addTweaks = useTweaksStore((state) => state.setTweak);
@@ -47,7 +47,12 @@ const ApiModal = forwardRef(
const [open, setOpen] = useState(false);
const [activeTab, setActiveTab] = useState("0");
const pythonApiCode = getPythonApiCode(flow?.id, autoLogin, tweak);
const curl_code = getCurlCode(flow?.id, autoLogin, tweak);
const curl_code = getCurlCode(
flow?.id,
autoLogin,
tweak,
flow?.endpoint_name
);
const pythonCode = getPythonCode(flow?.name, tweak);
const widgetCode = getWidgetCode(flow?.id, flow?.name, autoLogin);
const tweaksCode = buildTweaks(flow);
@@ -106,7 +111,7 @@ const ApiModal = forwardRef(
buildTweakObject(
nodeId,
element.data.node.template[templateField].value,
element.data.node.template[templateField],
element.data.node.template[templateField]
);
}
});
@@ -123,7 +128,7 @@ const ApiModal = forwardRef(
async function buildTweakObject(
tw: string,
changes: string | string[] | boolean | number | Object[] | Object,
template: TemplateVariableType,
template: TemplateVariableType
) {
changes = getChangesType(changes, template);
@@ -161,7 +166,12 @@ const ApiModal = forwardRef(
const addCodes = (cloneTweak) => {
const pythonApiCode = getPythonApiCode(flow?.id, autoLogin, cloneTweak);
const curl_code = getCurlCode(flow?.id, autoLogin, cloneTweak);
const curl_code = getCurlCode(
flow?.id,
autoLogin,
cloneTweak,
flow?.endpoint_name
);
const pythonCode = getPythonCode(flow?.name, cloneTweak);
const widgetCode = getWidgetCode(flow?.id, flow?.name, autoLogin);
@@ -204,7 +214,7 @@ const ApiModal = forwardRef(
</BaseModal.Content>
</BaseModal>
);
},
}
);
export default ApiModal;


@@ -1,19 +1,16 @@
import { ColDef, ColGroupDef } from "ag-grid-community";
import { AxiosError } from "axios";
import { useEffect, useRef, useState } from "react";
import IconComponent from "../../components/genericIconComponent";
import TableComponent from "../../components/tableComponent";
import { Tabs, TabsList, TabsTrigger } from "../../components/ui/tabs";
import { getMessagesTable, getTransactionTable } from "../../controllers/API";
import useAlertStore from "../../stores/alertStore";
import useFlowStore from "../../stores/flowStore";
import useFlowsManagerStore from "../../stores/flowsManagerStore";
import { FlowSettingsPropsType } from "../../types/components";
import { FlowType, NodeDataType } from "../../types/flow";
import BaseModal from "../baseModal";
import TableComponent from "../../components/tableComponent";
import { getMessagesTable, getTransactionTable } from "../../controllers/API";
import {
ColDef,
ColGroupDef,
SizeColumnsToFitGridStrategy,
} from "ag-grid-community";
import useAlertStore from "../../stores/alertStore";
import useFlowStore from "../../stores/flowStore";
export default function FlowLogsModal({
open,
@@ -41,8 +38,17 @@ export default function FlowLogsModal({
function handleClick(): void {
currentFlow!.name = name;
currentFlow!.description = description;
saveFlow(currentFlow!);
setOpen(false);
saveFlow(currentFlow!)
?.then(() => {
setOpen(false);
})
.catch((err) => {
useAlertStore.getState().setErrorData({
title: "Error while saving changes",
list: [(err as AxiosError).response?.data.detail ?? ""],
});
console.error(err);
});
}
useEffect(() => {
@@ -66,7 +72,7 @@ export default function FlowLogsModal({
.some((template) => template["stream"] && template["stream"].value);
console.log(
haStream,
nodes.map((nodes) => (nodes.data as NodeDataType).node!.template),
nodes.map((nodes) => (nodes.data as NodeDataType).node!.template)
);
if (haStream) {
setNoticeData({
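The modal hunks above replace a fire-and-forget `saveFlow(...)` with a save-then-close pattern: the modal only closes once the save resolves, and a rejection surfaces the backend's `detail` message. An illustrative TypeScript sketch of that flow (the `saveFlow` stand-in and the `SaveError` shape are assumptions modeled on an AxiosError, not the real store implementation):

```typescript
// Error shape modeled on an AxiosError carrying a backend detail message.
type SaveError = { response?: { data: { detail?: string } } };

// Stand-in for the store's saveFlow: resolves on success, rejects with a
// detail message (e.g. a duplicate endpoint name) on failure.
function saveFlow(ok: boolean): Promise<void> {
  return ok
    ? Promise.resolve()
    : Promise.reject({ response: { data: { detail: "name already in use" } } });
}

// Mirrors handleClick: close only after a successful save,
// otherwise report the server's detail (or an empty string).
async function handleSave(ok: boolean): Promise<string> {
  try {
    await saveFlow(ok);
    return "closed"; // setOpen(false) in the real modal
  } catch (err) {
    return (err as SaveError).response?.data.detail ?? "";
  }
}
```

The `?? ""` fallback matches the diff's guard against a response with no `detail` field.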


@@ -3,6 +3,7 @@ import EditFlowSettings from "../../components/editFlowSettingsComponent";
import IconComponent from "../../components/genericIconComponent";
import { Button } from "../../components/ui/button";
import { SETTINGS_DIALOG_SUBTITLE } from "../../constants/constants";
import useAlertStore from "../../stores/alertStore";
import useFlowsManagerStore from "../../stores/flowsManagerStore";
import { FlowSettingsPropsType } from "../../types/components";
import { FlowType } from "../../types/flow";
@@ -22,12 +23,23 @@ export default function FlowSettingsModal({
const [name, setName] = useState(currentFlow!.name);
const [description, setDescription] = useState(currentFlow!.description);
const [endpoint_name, setEndpointName] = useState(currentFlow!.endpoint_name);
function handleClick(): void {
currentFlow!.name = name;
currentFlow!.description = description;
saveFlow(currentFlow!);
setOpen(false);
currentFlow!.endpoint_name = endpoint_name;
saveFlow(currentFlow!)
?.then(() => {
setOpen(false);
})
.catch((err) => {
useAlertStore.getState().setErrorData({
title: "Error while saving changes",
list: [(err as AxiosError).response?.data.detail ?? ""],
});
console.error(err);
});
}
const [nameLists, setNameList] = useState<string[]>([]);
@@ -41,7 +53,7 @@
}, [flows]);
return (
<BaseModal open={open} setOpen={setOpen} size="smaller">
<BaseModal open={open} setOpen={setOpen} size="smaller-h-full">
<BaseModal.Header description={SETTINGS_DIALOG_SUBTITLE}>
<span className="pr-2">Settings</span>
<IconComponent name="Settings2" className="mr-2 h-4 w-4 " />
@@ -51,8 +63,10 @@
invalidNameList={nameLists}
name={name}
description={description}
endpointName={endpoint_name}
setName={setName}
setDescription={setDescription}
setEndpointName={setEndpointName}
/>
</BaseModal.Content>


@@ -1,4 +1,3 @@
import { AxiosError } from "axios";
import { cloneDeep, debounce } from "lodash";
import { Edge, Node, Viewport, XYPosition } from "reactflow";
import { create } from "zustand";
@@ -87,12 +86,12 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
if (dbData) {
const { data, flows } = processFlows(dbData, false);
const examples = flows.filter(
(flow) => flow.folder_id === starterFolderId,
(flow) => flow.folder_id === starterFolderId
);
get().setExamples(examples);
const flowsWithoutStarterFolder = flows.filter(
(flow) => flow.folder_id !== starterFolderId,
(flow) => flow.folder_id !== starterFolderId
);
get().setFlows(flowsWithoutStarterFolder);
@@ -120,7 +119,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
if (get().currentFlow) {
get().saveFlow(
{ ...get().currentFlow!, data: { nodes, edges, viewport } },
true,
true
);
}
},
@@ -146,7 +145,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
return updatedFlow;
}
return flow;
}),
})
);
//update tabs state
@@ -155,11 +154,9 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
}
})
.catch((err) => {
useAlertStore.getState().setErrorData({
title: "Error while saving changes",
list: [(err as AxiosError).message],
});
reject(err);
set({ saveLoading: false });
throw err;
});
});
}, SAVE_DEBOUNCE_TIME),
@@ -197,7 +194,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
flow?: FlowType,
override?: boolean,
position?: XYPosition,
fromDragAndDrop?: boolean,
fromDragAndDrop?: boolean
): Promise<string | undefined> => {
if (newProject) {
let flowData = flow
@@ -213,7 +210,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
const newFlow = createNewFlow(
flowData!,
flow!,
folder_id || my_collection_id!,
folder_id || my_collection_id!
);
const { id } = await saveFlowToDatabase(newFlow);
newFlow.id = id;
@@ -236,7 +233,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
const newFlow = createNewFlow(
flowData!,
flow!,
folder_id || my_collection_id!,
folder_id || my_collection_id!
);
const newName = addVersionToDuplicates(newFlow, get().flows);
@@ -272,7 +269,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
.getState()
.paste(
{ nodes: flow!.data!.nodes, edges: flow!.data!.edges },
position ?? { x: 10, y: 10 },
position ?? { x: 10, y: 10 }
);
}
},
@@ -282,7 +279,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
multipleDeleteFlowsComponents(id)
.then(() => {
const { data, flows } = processFlows(
get().flows.filter((flow) => !id.includes(flow.id)),
get().flows.filter((flow) => !id.includes(flow.id))
);
get().setFlows(flows);
set({ isLoading: false });
@@ -302,7 +299,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
deleteFlowFromDatabase(id)
.then(() => {
const { data, flows } = processFlows(
get().flows.filter((flow) => flow.id !== id),
get().flows.filter((flow) => flow.id !== id)
);
get().setFlows(flows);
set({ isLoading: false });
@@ -324,7 +321,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
return new Promise<void>((resolve) => {
let componentFlow = get().flows.find(
(componentFlow) =>
componentFlow.is_component && componentFlow.name === key,
componentFlow.is_component && componentFlow.name === key
);
if (componentFlow) {
@@ -372,7 +369,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
fileData,
undefined,
position,
true,
true
);
resolve(id);
}
@@ -413,7 +410,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
return get().addFlow(
true,
createFlowComponent(component, useDarkStore.getState().version),
override,
override
);
},
takeSnapshot: () => {
@@ -434,7 +431,7 @@ const useFlowsManagerStore = create<FlowsManagerStoreType>((set, get) => ({
if (pastLength > 0) {
past[currentFlowId] = past[currentFlowId].slice(
pastLength - defaultOptions.maxHistorySize + 1,
pastLength,
pastLength
);
past[currentFlowId].push(newState);


@@ -269,9 +269,11 @@ export type IconComponentProps = {
export type InputProps = {
name: string | null;
description: string | null;
endpointName?: string;
maxLength?: number;
setName?: (name: string) => void;
setDescription?: (description: string) => void;
setEndpointName?: (endpointName: string) => void;
invalidNameList?: string[];
};
@@ -517,7 +519,7 @@ export type nodeToolbarPropsType = {
updateNodeCode?: (
newNodeClass: APIClassType,
code: string,
name: string,
name: string
) => void;
setShowState: (show: boolean | SetStateAction<boolean>) => void;
isOutdated?: boolean;
@@ -567,7 +569,7 @@ export type chatMessagePropsType = {
updateChat: (
chat: ChatMessageType,
message: string,
stream_url?: string,
stream_url?: string
) => void;
};
@@ -659,12 +661,12 @@ export type codeTabsPropsType = {
value: string,
node: NodeType,
template: TemplateVariableType,
tweak: tweakType,
tweak: tweakType
) => string;
buildTweakObject?: (
tw: string,
changes: string | string[] | boolean | number | Object[] | Object,
template: TemplateVariableType,
template: TemplateVariableType
) => Promise<string | void>;
};
activeTweaks?: boolean;


@@ -7,6 +7,7 @@ export type FlowType = {
id: string;
data: ReactFlowJsonObject | null;
description: string;
endpoint_name?: string;
style?: FlowStyleType;
is_component?: boolean;
last_tested_version?: string;


@@ -30,14 +30,14 @@ def json_style():
def test_create_flow(client: TestClient, json_flow: str, active_user, logged_in_headers):
flow = orjson.loads(json_flow)
data = flow["data"]
flow = FlowCreate(name="Test Flow", description="description", data=data)
response = client.post("api/v1/flows/", json=flow.dict(), headers=logged_in_headers)
flow = FlowCreate(name=str(uuid4()), description="description", data=data)
response = client.post("api/v1/flows/", json=flow.model_dump(), headers=logged_in_headers)
assert response.status_code == 201
assert response.json()["name"] == flow.name
assert response.json()["data"] == flow.data
# flow is optional so we can create a flow without a flow
flow = FlowCreate(name="Test Flow")
response = client.post("api/v1/flows/", json=flow.dict(exclude_unset=True), headers=logged_in_headers)
response = client.post("api/v1/flows/", json=flow.model_dump(exclude_unset=True), headers=logged_in_headers)
assert response.status_code == 201
assert response.json()["name"] == flow.name
assert response.json()["data"] == flow.data
@@ -46,14 +46,14 @@ def test_create_flow(client: TestClient, json_flow: str, active_user, logged_in_
def test_read_flows(client: TestClient, json_flow: str, active_user, logged_in_headers):
flow_data = orjson.loads(json_flow)
data = flow_data["data"]
flow = FlowCreate(name="Test Flow", description="description", data=data)
response = client.post("api/v1/flows/", json=flow.dict(), headers=logged_in_headers)
flow = FlowCreate(name=str(uuid4()), description="description", data=data)
response = client.post("api/v1/flows/", json=flow.model_dump(), headers=logged_in_headers)
assert response.status_code == 201
assert response.json()["name"] == flow.name
assert response.json()["data"] == flow.data
flow = FlowCreate(name="Test Flow", description="description", data=data)
response = client.post("api/v1/flows/", json=flow.dict(), headers=logged_in_headers)
flow = FlowCreate(name=str(uuid4()), description="description", data=data)
response = client.post("api/v1/flows/", json=flow.model_dump(), headers=logged_in_headers)
assert response.status_code == 201
assert response.json()["name"] == flow.name
assert response.json()["data"] == flow.data
@@ -67,7 +67,7 @@ def test_read_flow(client: TestClient, json_flow: str, active_user, logged_in_he
flow = orjson.loads(json_flow)
data = flow["data"]
flow = FlowCreate(name="Test Flow", description="description", data=data)
response = client.post("api/v1/flows/", json=flow.dict(), headers=logged_in_headers)
response = client.post("api/v1/flows/", json=flow.model_dump(), headers=logged_in_headers)
flow_id = response.json()["id"] # flow_id should be a UUID but is a string
# turn it into a UUID
flow_id = UUID(flow_id)
@@ -83,7 +83,7 @@ def test_update_flow(client: TestClient, json_flow: str, active_user, logged_in_
data = flow["data"]
flow = FlowCreate(name="Test Flow", description="description", data=data)
response = client.post("api/v1/flows/", json=flow.dict(), headers=logged_in_headers)
response = client.post("api/v1/flows/", json=flow.model_dump(), headers=logged_in_headers)
flow_id = response.json()["id"]
updated_flow = FlowUpdate(
@@ -91,7 +91,7 @@ def test_update_flow(client: TestClient, json_flow: str, active_user, logged_in_
description="updated description",
data=data,
)
response = client.patch(f"api/v1/flows/{flow_id}", json=updated_flow.dict(), headers=logged_in_headers)
response = client.patch(f"api/v1/flows/{flow_id}", json=updated_flow.model_dump(), headers=logged_in_headers)
assert response.status_code == 200
assert response.json()["name"] == updated_flow.name
@@ -103,7 +103,7 @@ def test_delete_flow(client: TestClient, json_flow: str, active_user, logged_in_
flow = orjson.loads(json_flow)
data = flow["data"]
flow = FlowCreate(name="Test Flow", description="description", data=data)
response = client.post("api/v1/flows/", json=flow.dict(), headers=logged_in_headers)
response = client.post("api/v1/flows/", json=flow.model_dump(), headers=logged_in_headers)
flow_id = response.json()["id"]
response = client.delete(f"api/v1/flows/{flow_id}", headers=logged_in_headers)
assert response.status_code == 200
@@ -223,8 +223,8 @@ def test_update_flow_idempotency(client: TestClient, json_flow: str, active_user
response = client.post("api/v1/flows/", json=flow_data.dict(), headers=logged_in_headers)
flow_id = response.json()["id"]
updated_flow = FlowCreate(name="Updated Flow", description="description", data=data)
response1 = client.put(f"api/v1/flows/{flow_id}", json=updated_flow.dict(), headers=logged_in_headers)
response2 = client.put(f"api/v1/flows/{flow_id}", json=updated_flow.dict(), headers=logged_in_headers)
response1 = client.put(f"api/v1/flows/{flow_id}", json=updated_flow.model_dump(), headers=logged_in_headers)
response2 = client.put(f"api/v1/flows/{flow_id}", json=updated_flow.model_dump(), headers=logged_in_headers)
assert response1.json() == response2.json()
@@ -237,8 +237,8 @@ def test_update_nonexistent_flow(client: TestClient, json_flow: str, active_user
description="description",
data=data,
)
response = client.patch(f"api/v1/flows/{uuid}", json=updated_flow.dict(), headers=logged_in_headers)
assert response.status_code == 404
response = client.patch(f"api/v1/flows/{uuid}", json=updated_flow.model_dump(), headers=logged_in_headers)
assert response.status_code == 404, response.text
def test_delete_nonexistent_flow(client: TestClient, active_user, logged_in_headers):


@@ -652,7 +652,7 @@ def test_invalid_flow_id(client, created_api_key):
headers = {"x-api-key": created_api_key.api_key}
flow_id = "invalid-flow-id"
response = client.post(f"/api/v1/run/{flow_id}", headers=headers)
assert response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY, response.text
assert response.status_code == status.HTTP_404_NOT_FOUND, response.text
headers = {"x-api-key": created_api_key.api_key}
flow_id = UUID(int=0)
response = client.post(f"/api/v1/run/{flow_id}", headers=headers)
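The test change above reflects the new lookup order on the backend: the path parameter is first parsed as a UUID and, failing that, treated as an endpoint name, so an unrecognized identifier now produces 404 (flow not found) rather than 422 (invalid UUID). A TypeScript sketch of that resolution order (the regex check and the in-memory maps are illustrative assumptions; the real endpoint parses with Python's `UUID` and queries the database by `endpoint_name`):

```typescript
// Illustrative resolver: try UUID first, then endpoint name, else 404.
const UUID_RE =
  /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

function resolveFlow(
  idOrName: string,
  flowsById: Map<string, string>,
  flowsByEndpoint: Map<string, string>
): { status: number; flow?: string } {
  if (UUID_RE.test(idOrName)) {
    // Looks like a UUID: resolve by flow ID only.
    const flow = flowsById.get(idOrName);
    return flow ? { status: 200, flow } : { status: 404 };
  }
  // Not a UUID: fall back to the endpoint-name lookup.
  const flow = flowsByEndpoint.get(idOrName);
  return flow ? { status: 200, flow } : { status: 404 };
}
```

With this ordering, a malformed string like `"invalid-flow-id"` is simply an endpoint name that matches nothing, which is why the assertion now expects 404.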