Add Google Gemini Component, improvements and bugfixes (#1243)

This release of Langflow introduces a range of enhancements and notable
new features aimed at improving functionality, code quality, and user
experience. Key updates include the introduction of the Google Gemini
Component, a pivotal addition that expands the capabilities of Langflow
in language model applications.

Significant improvements were made to the custom component functionality,
notably in the areas of serialization and validation. Basic support for
LangChain Runnables was added to the `process` endpoint.
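The release note mentions Runnable support on the `process` endpoint without showing usage. A minimal, hypothetical sketch of what a client request might look like; the route shape, flow id, and payload keys are illustrative assumptions, not taken from this changelog:

```python
import json

BASE_URL = "http://localhost:7860"  # Langflow's default port (assumed)
FLOW_ID = "my-flow-id"              # hypothetical placeholder identifier

# The JSON body a client would POST (e.g. with `requests`); with Runnable
# support, the flow behind the endpoint may wrap a LangChain Runnable
# that is invoked with these inputs.
payload = {"inputs": {"text": "Hello, Gemini!"}}

url = f"{BASE_URL}/api/v1/process/{FLOW_ID}"
body = json.dumps(payload)
print(url)
print(body)
```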

On the front end, we've implemented changes to improve the overall user
interface and interactions. These include updates to the GenericNode
component, adjustments in the UI for better handling of nodes in the
flow, and the addition of new icon components for a more intuitive user
experience.

Backend refinements include updates to the Anthropic LLM and FAISS
components, along with several bug fixes and code optimizations. These
changes enhance the stability and efficiency of Langflow.
anovazzi1 2023-12-22 13:09:15 -03:00 committed by GitHub
commit 0d8cdeb073
91 changed files with 8789 additions and 2288 deletions


@ -1 +1,2 @@
.venv/
**/aws

.github/stale.yml vendored

@ -1,17 +0,0 @@
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 45
# Number of days of inactivity before a stale issue is closed
daysUntilClose: 7
# Issues with these labels will never be considered stale
exemptLabels:
- pinned
- security
# Label to use when marking an issue as stale
staleLabel: stale
# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
This issue has been automatically marked as stale because it has not had
recent activity. It will be closed if no further activity occurs. Thank you
for your contributions.
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false


@ -40,5 +40,5 @@ jobs:
run: docker compose up --exit-code-from tests tests result_backend broker celeryworker db --build
continue-on-error: true
- name: Stop services
run: docker compose down
# - name: Stop services
# run: docker compose down


@ -8,6 +8,7 @@ on:
- dev
paths:
- "pyproject.toml"
workflow_dispatch:
env:
POETRY_VERSION: "1.5.1"
@ -40,7 +41,7 @@ jobs:
generateReleaseNotes: true
prerelease: true
tag: v${{ steps.check-version.outputs.version }}
commit: main
commit: dev
- name: Publish to PyPI
env:
POETRY_PYPI_TOKEN_PYPI: ${{ secrets.PYPI_API_TOKEN }}


@ -18,9 +18,12 @@ coverage:
--cov-report xml \
--cov-report term-missing:skip-covered
# allow passing arguments to pytest
tests:
@make install_backend
poetry run pytest tests --instafail
poetry run pytest tests --instafail $(args)
# Use like:
format:
poetry run ruff . --fix


@ -115,7 +115,7 @@ Each option is detailed below:
- `--install-completion [bash|zsh|fish|powershell|pwsh]`: Installs completion for the specified shell.
- `--show-completion [bash|zsh|fish|powershell|pwsh]`: Shows completion for the specified shell, allowing you to copy it or customize the installation.
- `--backend-only`: This parameter, with a default value of `False`, allows running only the backend server without the frontend. It can also be set using the `LANGFLOW_BACKEND_ONLY` environment variable.
- `store`: This parameter, with a default value of `True`, enables the store features; use `--no-store` to deactivate it. It can be configured using the `LANGFLOW_STORE` environment variable.
- `--store`: This parameter, with a default value of `True`, enables the store features; use `--no-store` to deactivate it. It can be configured using the `LANGFLOW_STORE` environment variable.
These parameters are important for users who need to customize the behavior of Langflow, especially in development or specialized deployment scenarios. You may want to update the documentation to include these parameters for completeness and clarity.

cdk.Dockerfile Normal file

@ -0,0 +1,20 @@
FROM --platform=linux/amd64 python:3.10-slim
WORKDIR /app
# Install Poetry
RUN apt-get update && apt-get install gcc g++ curl build-essential postgresql-server-dev-all -y
RUN curl -sSL https://install.python-poetry.org | python3 -
# Add Poetry to PATH
ENV PATH="${PATH}:/root/.local/bin"
# Copy the pyproject.toml and poetry.lock files
COPY poetry.lock pyproject.toml ./
# Copy the rest of the application code
COPY ./ ./
# Install dependencies
RUN poetry config virtualenvs.create false && poetry install --no-interaction --no-ansi
RUN poetry add pymysql==1.0.2
CMD ["sh", "./container-cmd-cdk.sh"]

container-cmd-cdk.sh Normal file

@ -0,0 +1,3 @@
export LANGFLOW_DATABASE_URL="mysql+pymysql://${username}:${password}@${host}:3306/${dbname}"
# echo $LANGFLOW_DATABASE_URL
uvicorn --factory src.backend.langflow.main:create_app --host 0.0.0.0 --port 7860 --reload --log-level debug
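The `--factory` flag tells uvicorn that `create_app` is a zero-argument callable returning the ASGI application, rather than the application object itself. A stand-in sketch of that factory shape (the real factory lives in `src/backend/langflow/main.py` and returns a FastAPI app; a plain dict is used here only to avoid the dependency):

```python
def create_app():
    # In Langflow this builds and returns a FastAPI app; the dict below
    # merely stands in to show the zero-argument factory shape that
    # `uvicorn --factory module:create_app` expects.
    app = {"name": "langflow", "routes": []}
    return app

app = create_app()
print(app["name"])
```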


@ -1,28 +1,35 @@
version: "3"
networks:
langflow:
services:
backend:
build:
context: ./
dockerfile: ./dev.Dockerfile
env_file:
- .env
ports:
- "7860:7860"
volumes:
- ./:/app
command: bash -c "uvicorn --factory src.backend.langflow.main:create_app --host 0.0.0.0 --port 7860 --reload"
networks:
- langflow
frontend:
build:
context: ./src/frontend
dockerfile: ./dev.Dockerfile
dockerfile: ./cdk.Dockerfile
args:
- BACKEND_URL=http://backend:7860
environment:
- VITE_PROXY_TARGET=http://backend:7860
ports:
- "3000:3000"
- "8080:3000"
volumes:
- ./src/frontend/public:/home/node/app/public
- ./src/frontend/src:/home/node/app/src
- ./src/frontend/package.json:/home/node/app/package.json
restart: on-failure
networks:
- langflow


@ -34,6 +34,7 @@ The CustomComponent class serves as the foundation for creating custom component
| --------------------------------------------------------- |
| _`str`_, _`int`_, _`float`_, _`bool`_, _`list`_, _`dict`_ |
| _`langflow.field_typing.NestedDict`_ |
| _`langflow.field_typing.Prompt`_ |
| _`langchain.chains.base.Chain`_ |
| _`langchain.PromptTemplate`_ |
| _`langchain.llms.base.BaseLLM`_ |
@ -48,10 +49,17 @@ The CustomComponent class serves as the foundation for creating custom component
The difference between _`dict`_ and _`langflow.field_typing.NestedDict`_ is that one adds a simple key-value pair field, while the other opens a more robust dictionary editor.
<Admonition type="info">
To use the _`Prompt`_ type, you must also add _`**kwargs`_ to the _`build`_ method, because the _`Prompt`_ type passes arbitrary new keyword arguments to it.
If you want to add the values of the variables to the template you defined, you must format the PromptTemplate inside the CustomComponent class.
</Admonition>
<Admonition type="info">
Unlike Langchain types, base Python types do not add a
[handle](../guidelines/components) to the field by default. To add handles,
use the _`input_types`_ key in the _`build_config`_ method.
</Admonition>
- **build_config**: Used to define the configuration fields of the component (if applicable). It should always return a dictionary with specific keys representing the field names and corresponding configurations. This method is called when the code is processed (i.e., when you click _Check and Save_ in the code editor). It must follow the format described below:
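The `Prompt`/`**kwargs` behavior described above can be illustrated with a plain-Python stand-in. The real code would subclass langflow's `CustomComponent` and use `langflow.field_typing.Prompt`; the hypothetical class and `str.format` below only sketch how the arbitrary keyword arguments reach the template:

```python
class PromptComponentSketch:
    # Stand-in for a CustomComponent whose template field has the Prompt type.
    template = "Answer the question: {question} (tone: {tone})"

    def build_config(self) -> dict:
        # Field configuration; the `input_types` key is what adds handles
        # to base-type fields.
        return {"template": {"display_name": "Template", "input_types": ["Text"]}}

    def build(self, **kwargs) -> str:
        # The Prompt type passes new arbitrary keyword arguments to build(),
        # so the template is formatted here, inside the component.
        return self.template.format(**kwargs)

result = PromptComponentSketch().build(question="What is Langflow?", tone="brief")
print(result)
```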

poetry.lock generated

File diff suppressed because it is too large


@ -1,6 +1,6 @@
[tool.poetry]
name = "langflow"
version = "0.6.2"
version = "0.6.3"
description = "A Python package with a built-in web application"
authors = ["Logspace <contact@logspace.ai>"]
maintainers = [
@ -96,12 +96,14 @@ pillow = "^10.0.0"
metal-sdk = "^2.4.0"
markupsafe = "^2.1.3"
extract-msg = "^0.45.0"
jq = "^1.6.0"
# jq is not available for windows
jq = { version = "^1.6.0", markers = "sys_platform != 'win32'" }
boto3 = "^1.28.63"
numexpr = "^2.8.6"
qianfan = "0.0.5"
pgvector = "^0.2.3"
pyautogen = "^0.2.0"
langchain-google-genai = "^0.0.2"
[tool.poetry.group.dev.dependencies]
pytest-asyncio = "^0.23.1"

scripts/aws/.env.example Normal file

@ -0,0 +1,11 @@
# Description: Example of .env file
# Usage: Copy this file to .env and change the values
# according to your needs
# Do not commit .env file to git
# Do not change .env.example file
# You can set up a superuser's username and password
# If there is no need for user management, set LANGFLOW_AUTO_LOGIN=true and delete LANGFLOW_SUPERUSER and LANGFLOW_SUPERUSER_PASSWORD.
LANGFLOW_AUTO_LOGIN=false
LANGFLOW_SUPERUSER=admin
LANGFLOW_SUPERUSER_PASSWORD=123456

scripts/aws/.gitignore vendored Normal file

@ -0,0 +1,9 @@
*.js
!jest.config.js
*.d.ts
node_modules
# CDK asset staging directory
.cdk.staging
cdk.out
!/lib

scripts/aws/.npmignore Normal file

@ -0,0 +1,6 @@
*.ts
!*.d.ts
# CDK asset staging directory
.cdk.staging
cdk.out

scripts/aws/README.ja.md Normal file

@ -0,0 +1,54 @@
# Langflow on AWS
**Estimated time**: 30 minutes
## Description
Langflow on AWS teaches you how to deploy Langflow on AWS using the [AWS Cloud Development Kit](https://aws.amazon.com/cdk/?nc2=type_a) (CDK).
This tutorial assumes you have an AWS account and basic knowledge of AWS.
The architecture of the application to be created:
![langflow-archi](./img/langflow-archi.png)
AWS CDK creates an [Application Load Balancer](https://aws.amazon.com/elasticloadbalancing/application-load-balancer/?nc1=h_ls), [AWS Fargate](https://aws.amazon.com/fargate/?nc2=type_a), and [Amazon Aurora](https://aws.amazon.com/rds/aurora/?nc2=type_a).
The Aurora secrets are managed by [AWS Secrets Manager](https://aws.amazon.com/secrets-manager/?nc2=type_a).
The Fargate tasks are split into a frontend and a backend, which communicate through service discovery.
If you just want to deploy the resources, you do not need in-depth knowledge of each of the above services.
# Environment setup and deployment
1. Open [AWS CloudShell](https://us-east-1.console.aws.amazon.com/cloudshell/home?region=us-east-1).
1. Run the following commands:
```shell
git clone https://github.com/aws-samples/cloud9-setup-for-prototyping
cd cloud9-setup-for-prototyping
./bin/bootstrap
```
1. When `Done!` is displayed, open `cloud9-for-prototyping` from [AWS Cloud9](https://us-east-1.console.aws.amazon.com/cloud9control/home?region=us-east-1#/).
![make-cloud9](./img/langflow-cloud9.png)
1. Run the following commands:
```shell
git clone -b aws-cdk https://github.com/logspace-ai/langflow.git
cd langflow/scripts/aws
cp .env.example .env # Edit this file (.env) if you want to change the environment settings
npm ci
cdk bootstrap
cdk deploy
```
1. Access the URL that is displayed:
```shell
Outputs:
LangflowAppStack.NetworkURLXXXXXX = http://alb-XXXXXXXXXXX.elb.amazonaws.com
```
1. Enter your username and password on the sign-in screen. If you have not set a username and password in the `.env` file, the username is set to `admin` and the password to `123456`.
![make-cloud9](./img/langflow-signin.png)
# Cleanup
1. Run the following command in `Cloud9`:
```shell
bash delete-resources.sh
```
1. Open [AWS CloudFormation](https://us-east-1.console.aws.amazon.com/cloudformation/home?region=us-east-1#/getting-started), select `aws-cloud9-cloud9-for-prototyping-XXXX`, and delete it.
![delete-cfn](./img/langflow-cfn.png)

scripts/aws/README.md Normal file

@ -0,0 +1,53 @@
# Deploy Langflow on AWS
**Duration**: 30 minutes
## Introduction
In this tutorial, you will learn how to deploy Langflow on AWS using the [AWS Cloud Development Kit](https://aws.amazon.com/cdk/?nc2=type_a) (CDK).
This tutorial assumes you have an AWS account and basic knowledge of AWS.
The architecture of the application to be created:
![langflow-archi](./img/langflow-archi.png)
[Application Load Balancer](https://aws.amazon.com/elasticloadbalancing/application-load-balancer/?nc1=h_ls), [AWS Fargate](https://aws.amazon.com/fargate/?nc2=type_a) and [Amazon Aurora](https://aws.amazon.com/rds/aurora/?nc2=type_a) are created by AWS CDK.
The Aurora secrets are managed by [AWS Secrets Manager](https://aws.amazon.com/secrets-manager/?nc2=type_a).
The Fargate task is divided into a frontend and a backend, which communicate through service discovery.
If you just want to deploy resources, you do not need in-depth knowledge of each of the above services.
# How to set up your environment and deploy Langflow
1. Open [AWS CloudShell](https://us-east-1.console.aws.amazon.com/cloudshell/home?region=us-east-1).
1. Run the following commands in CloudShell:
```shell
git clone https://github.com/aws-samples/cloud9-setup-for-prototyping
cd cloud9-setup-for-prototyping
./bin/bootstrap
```
1. When you see `Done!` in CloudShell, open `cloud9-for-prototyping` from [AWS Cloud9](https://us-east-1.console.aws.amazon.com/cloud9control/home?region=us-east-1#/).
![make-cloud9](./img/langflow-cloud9-en.png)
1. Run the following command in the Cloud9 terminal.
```shell
git clone -b aws-cdk https://github.com/logspace-ai/langflow.git
cd langflow/scripts/aws
cp .env.example .env # Edit this file if you need to change environment settings
npm ci
cdk bootstrap
cdk deploy
```
1. Access the URL displayed.
```shell
Outputs:
LangflowAppStack.NetworkURLXXXXXX = http://alb-XXXXXXXXXXX.elb.amazonaws.com
```
1. Enter your username and password to sign in. If you have not set a username and password in your `.env` file, the username will be set to `admin` and the password to `123456`.
![signin-langflow](./img/langflow-signin.png)
# Cleanup
1. Run the following command in the Cloud9 terminal.
```shell
bash delete-resources.sh
```
1. Open [AWS CloudFormation](https://us-east-1.console.aws.amazon.com/cloudformation/home?region=us-east-1#/getting-started), select `aws-cloud9-cloud9-for-prototyping-XXXX` and delete it.
![delete-cfn](./img/langflow-cfn.png)

scripts/aws/bin/cdk.ts Normal file

@ -0,0 +1,21 @@
#!/usr/bin/env node
import 'source-map-support/register';
import * as cdk from 'aws-cdk-lib';
import { LangflowAppStack } from '../lib/cdk-stack';
const app = new cdk.App();
new LangflowAppStack(app, 'LangflowAppStack', {
/* If you don't specify 'env', this stack will be environment-agnostic.
* Account/Region-dependent features and context lookups will not work,
* but a single synthesized template can be deployed anywhere. */
/* Uncomment the next line to specialize this stack for the AWS Account
* and Region that are implied by the current CLI configuration. */
// env: { account: process.env.CDK_DEFAULT_ACCOUNT, region: process.env.CDK_DEFAULT_REGION },
/* Uncomment the next line if you know exactly what Account and Region you
* want to deploy the stack to. */
// env: { account: '123456789012', region: 'us-east-1' },
/* For more information, see https://docs.aws.amazon.com/cdk/latest/guide/environments.html */
});

scripts/aws/cdk.json Normal file

@ -0,0 +1,55 @@
{
"app": "npx ts-node --prefer-ts-exts bin/cdk.ts",
"watch": {
"include": [
"**"
],
"exclude": [
"README.md",
"cdk*.json",
"**/*.d.ts",
"**/*.js",
"tsconfig.json",
"package*.json",
"yarn.lock",
"node_modules",
"test"
]
},
"context": {
"@aws-cdk/aws-lambda:recognizeLayerVersion": true,
"@aws-cdk/core:checkSecretUsage": true,
"@aws-cdk/core:target-partitions": [
"aws",
"aws-cn"
],
"@aws-cdk-containers/ecs-service-extensions:enableDefaultLogDriver": true,
"@aws-cdk/aws-ec2:uniqueImdsv2TemplateName": true,
"@aws-cdk/aws-ecs:arnFormatIncludesClusterName": true,
"@aws-cdk/aws-iam:minimizePolicies": true,
"@aws-cdk/core:validateSnapshotRemovalPolicy": true,
"@aws-cdk/aws-codepipeline:crossAccountKeyAliasStackSafeResourceName": true,
"@aws-cdk/aws-s3:createDefaultLoggingPolicy": true,
"@aws-cdk/aws-sns-subscriptions:restrictSqsDescryption": true,
"@aws-cdk/aws-apigateway:disableCloudWatchRole": true,
"@aws-cdk/core:enablePartitionLiterals": true,
"@aws-cdk/aws-events:eventsTargetQueueSameAccount": true,
"@aws-cdk/aws-iam:standardizedServicePrincipals": true,
"@aws-cdk/aws-ecs:disableExplicitDeploymentControllerForCircuitBreaker": true,
"@aws-cdk/aws-iam:importedRoleStackSafeDefaultPolicyName": true,
"@aws-cdk/aws-s3:serverAccessLogsUseBucketPolicy": true,
"@aws-cdk/aws-route53-patters:useCertificate": true,
"@aws-cdk/customresources:installLatestAwsSdkDefault": false,
"@aws-cdk/aws-rds:databaseProxyUniqueResourceName": true,
"@aws-cdk/aws-codedeploy:removeAlarmsFromDeploymentGroup": true,
"@aws-cdk/aws-apigateway:authorizerChangeDeploymentLogicalId": true,
"@aws-cdk/aws-ec2:launchTemplateDefaultUserData": true,
"@aws-cdk/aws-secretsmanager:useAttachedSecretResourcePolicyForSecretTargetAttachments": true,
"@aws-cdk/aws-redshift:columnId": true,
"@aws-cdk/aws-stepfunctions-tasks:enableEmrServicePolicyV2": true,
"@aws-cdk/aws-ec2:restrictDefaultSecurityGroup": true,
"@aws-cdk/aws-apigateway:requestValidatorUniqueId": true,
"@aws-cdk/aws-kms:aliasNameRef": true,
"@aws-cdk/core:includePrefixInUniqueNameGeneration": true
}
}


@ -0,0 +1,3 @@
docker stop $(docker ps -aq)
docker rm $(docker ps -aq)
docker rmi -f $(docker images -aq)


@ -0,0 +1,4 @@
aws cloudformation delete-stack --stack-name LangflowAppStack
aws ecr delete-repository --repository-name langflow-backend-repository --force
aws ecr delete-repository --repository-name langflow-frontend-repository --force
# aws ecr describe-repositories --output json | jq -re ".repositories[].repositoryName"

Binary image files added under `scripts/aws/img/` (architecture diagram and console screenshots); not shown in the diff.


@ -0,0 +1,8 @@
module.exports = {
testEnvironment: 'node',
roots: ['<rootDir>/test'],
testMatch: ['**/*.test.ts'],
transform: {
'^.+\\.tsx?$': 'ts-jest'
}
};


@ -0,0 +1,64 @@
import * as cdk from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as ecs from 'aws-cdk-lib/aws-ecs'
import { Network, EcrRepository, FrontEndCluster, BackEndCluster, Rds, EcsIAM } from './construct';
// import * as sqs from 'aws-cdk-lib/aws-sqs';
export class LangflowAppStack extends cdk.Stack {
constructor(scope: Construct, id: string, props?: cdk.StackProps) {
super(scope, id, props);
// Arch
const arch = ecs.CpuArchitecture.X86_64
// VPC
const { vpc, cluster, alb, targetGroup, cloudmapNamespace, ecsFrontSG, ecsBackSG, dbSG, albSG, backendLogGroup, frontendLogGroup} = new Network(this, 'Network')
// ECR
const { ecrFrontEndRepository,ecrBackEndRepository} = new EcrRepository(this, 'Ecr', {
cloudmapNamespace:cloudmapNamespace,
arch:arch
})
// RDS
// Pass the VPC and security group resource information as props
const { rdsCluster } = new Rds(this, 'Rds', { vpc, dbSG })
// IAM
const { frontendTaskRole, frontendTaskExecutionRole, backendTaskRole, backendTaskExecutionRole } = new EcsIAM(this, 'EcsIAM',{
rdsCluster:rdsCluster
})
const backendService = new BackEndCluster(this, 'backend', {
cluster:cluster,
ecsBackSG:ecsBackSG,
ecrBackEndRepository:ecrBackEndRepository,
backendTaskRole:backendTaskRole,
backendTaskExecutionRole:backendTaskExecutionRole,
backendLogGroup:backendLogGroup,
cloudmapNamespace:cloudmapNamespace,
rdsCluster:rdsCluster,
alb:alb,
arch:arch
})
backendService.node.addDependency(rdsCluster);
const frontendService = new FrontEndCluster(this, 'frontend',{
cluster:cluster,
ecsFrontSG:ecsFrontSG,
ecrFrontEndRepository:ecrFrontEndRepository,
targetGroup: targetGroup,
backendServiceName: backendService.backendServiceName,
frontendTaskRole: frontendTaskRole,
frontendTaskExecutionRole: frontendTaskExecutionRole,
frontendLogGroup: frontendLogGroup,
cloudmapNamespace: cloudmapNamespace,
arch:arch
})
frontendService.node.addDependency(backendService);
// S3+CloudFront
// new Web(this,'Cloudfront-S3')
}
}


@ -0,0 +1,105 @@
import { Duration } from 'aws-cdk-lib'
import { Construct } from 'constructs';
import {
aws_ec2 as ec2,
aws_ecs as ecs,
aws_ecr as ecr,
aws_rds as rds,
aws_servicediscovery as servicediscovery,
aws_iam as iam,
aws_logs as logs,
aws_elasticloadbalancingv2 as elb,
} from 'aws-cdk-lib';
import * as dotenv from 'dotenv';
const path = require('path');
dotenv.config({path: path.join(__dirname, "../../.env")});
interface BackEndProps {
cluster: ecs.Cluster
ecsBackSG:ec2.SecurityGroup
ecrBackEndRepository:ecr.Repository
backendTaskRole: iam.Role;
backendTaskExecutionRole: iam.Role;
backendLogGroup: logs.LogGroup;
cloudmapNamespace: servicediscovery.PrivateDnsNamespace;
rdsCluster:rds.DatabaseCluster
alb:elb.IApplicationLoadBalancer
arch:ecs.CpuArchitecture
}
export class BackEndCluster extends Construct {
readonly backendServiceName: string
constructor(scope: Construct, id: string, props:BackEndProps) {
super(scope, id)
const containerPort = 7860
// Fetch the DB credentials from Secrets Manager
const secretsDB = props.rdsCluster.secret!;
// Create Backend Fargate Service
const backendTaskDefinition = new ecs.FargateTaskDefinition(
this,
'BackEndTaskDef',
{
memoryLimitMiB: 3072,
cpu: 1024,
executionRole: props.backendTaskExecutionRole,
runtimePlatform:{
operatingSystemFamily: ecs.OperatingSystemFamily.LINUX,
cpuArchitecture: props.arch,
},
taskRole: props.backendTaskRole,
}
);
backendTaskDefinition.addContainer('backendContainer', {
image: ecs.ContainerImage.fromEcrRepository(props.ecrBackEndRepository, "latest"),
containerName:'langflow-back-container',
logging: ecs.LogDriver.awsLogs({
streamPrefix: 'my-stream',
logGroup: props.backendLogGroup,
}),
environment:{
// user:pass@endpoint:port/dbname
// "LANGFLOW_DATABASE_URL" : `mysql+pymysql://${username}:${password}@${host}:3306/${dbname}`,
// "LANGFLOW_DATABASE_URL" : "sqlite:///./langflow.db",
// "LANGFLOW_LANGCHAIN_CACHE" : "SQLiteCache",
// "LANGFLOW_AUTO_LOGIN" : "false",
// "LANGFLOW_SUPERUSER" : "admin",
// "LANGFLOW_SUPERUSER_PASSWORD" : "1234567"
"LANGFLOW_AUTO_LOGIN" : process.env.LANGFLOW_AUTO_LOGIN ?? 'false',
"LANGFLOW_SUPERUSER" : process.env.LANGFLOW_SUPERUSER ?? "admin",
"LANGFLOW_SUPERUSER_PASSWORD" : process.env.LANGFLOW_SUPERUSER_PASSWORD ?? "123456"
},
portMappings: [
{
containerPort: containerPort,
protocol: ecs.Protocol.TCP,
},
],
// Configure the secrets
secrets: {
"dbname": ecs.Secret.fromSecretsManager(secretsDB, 'dbname'),
"username": ecs.Secret.fromSecretsManager(secretsDB, 'username'),
"host": ecs.Secret.fromSecretsManager(secretsDB, 'host'),
"password": ecs.Secret.fromSecretsManager(secretsDB, 'password'),
},
});
this.backendServiceName = 'backend'
const backendService = new ecs.FargateService(this, 'BackEndService', {
cluster: props.cluster,
serviceName: this.backendServiceName,
taskDefinition: backendTaskDefinition,
enableExecuteCommand: true,
securityGroups: [props.ecsBackSG],
cloudMapOptions: {
cloudMapNamespace: props.cloudmapNamespace,
containerPort: containerPort,
dnsRecordType: servicediscovery.DnsRecordType.A,
dnsTtl: Duration.seconds(10),
name: this.backendServiceName
},
vpcSubnets: { subnetType: ec2.SubnetType.PRIVATE_WITH_EGRESS },
});
}
}


@ -0,0 +1,65 @@
import { Construct } from 'constructs';
import * as ec2 from 'aws-cdk-lib/aws-ec2'
import * as rds from "aws-cdk-lib/aws-rds";
import * as cdk from 'aws-cdk-lib';
interface RdsProps {
vpc: ec2.Vpc
dbSG:ec2.SecurityGroup
}
export class Rds extends Construct{
readonly rdsCluster: rds.DatabaseCluster
constructor(scope: Construct, id:string, props: RdsProps){
super(scope, id);
const {vpc, dbSG} = props
const instanceType = ec2.InstanceType.of(ec2.InstanceClass.BURSTABLE4_GRAVITON, ec2.InstanceSize.MEDIUM)
// Auto-generate the RDS password and store it in Secrets Manager
const rdsCredentials = rds.Credentials.fromGeneratedSecret('db_user',{
secretName: 'langflow-DbSecret',
})
// Create the DB cluster parameter group
const clusterParameterGroup = new rds.ParameterGroup(scope, 'ClusterParameterGroup',{
engine: rds.DatabaseClusterEngine.auroraMysql({
version: rds.AuroraMysqlEngineVersion.VER_3_02_0
}),
description: 'for-langflow',
})
clusterParameterGroup.bindToCluster({})
// Create the DB instance parameter group
const instanceParameterGroup = new rds.ParameterGroup(scope, 'InstanceParameterGroup',{
engine: rds.DatabaseClusterEngine.auroraMysql({
version: rds.AuroraMysqlEngineVersion.VER_3_02_0,
}),
description: 'for-langflow',
})
instanceParameterGroup.bindToInstance({})
this.rdsCluster = new rds.DatabaseCluster(scope, 'LangflowDbCluster', {
engine: rds.DatabaseClusterEngine.auroraMysql({
version: rds.AuroraMysqlEngineVersion.VER_3_02_0,
}),
storageEncrypted: true,
credentials: rdsCredentials,
instanceIdentifierBase: 'langflow-instance',
vpc:vpc,
vpcSubnets:vpc.selectSubnets({
subnetGroupName: 'langflow-Isolated',
}),
securityGroups:[dbSG],
writer: rds.ClusterInstance.provisioned("WriterInstance", {
instanceType: instanceType,
enablePerformanceInsights: true,
parameterGroup:instanceParameterGroup,
}),
// Configure additional instances with readers:
parameterGroup: clusterParameterGroup,
defaultDatabaseName: 'langflow',
})
}
}


@ -0,0 +1,78 @@
import { RemovalPolicy } from 'aws-cdk-lib'
import * as ecr from 'aws-cdk-lib/aws-ecr'
import * as ecrdeploy from 'cdk-ecr-deployment'
import * as ecs from 'aws-cdk-lib/aws-ecs'
import * as servicediscovery from 'aws-cdk-lib/aws-servicediscovery'
import { DockerImageAsset, Platform } from 'aws-cdk-lib/aws-ecr-assets'
import * as path from "path";
import { Construct } from 'constructs'
interface ECRProps {
cloudmapNamespace: servicediscovery.PrivateDnsNamespace;
arch:ecs.CpuArchitecture;
}
export class EcrRepository extends Construct {
readonly ecrFrontEndRepository: ecr.Repository
readonly ecrBackEndRepository: ecr.Repository
constructor(scope: Construct, id: string, props: ECRProps) {
super(scope, id)
const imagePlatform = props.arch == ecs.CpuArchitecture.ARM64 ? Platform.LINUX_ARM64 : Platform.LINUX_AMD64
const backendPath = path.join(__dirname, "../../../../../", "langflow")
const frontendPath = path.join(__dirname, "../../../../src/", "frontend")
const excludeDir = ['node_modules','.git', 'cdk.out']
const LifecycleRule = {
tagStatus: ecr.TagStatus.ANY,
description: 'Keep at most 30 images',
maxImageCount: 30,
}
// Create the repositories
this.ecrFrontEndRepository = new ecr.Repository(scope, 'LangflowFrontEndRepository', {
repositoryName: 'langflow-frontend-repository',
removalPolicy: RemovalPolicy.RETAIN,
imageScanOnPush: true,
})
this.ecrBackEndRepository = new ecr.Repository(scope, 'LangflowBackEndRepository', {
repositoryName: 'langflow-backend-repository',
removalPolicy: RemovalPolicy.RETAIN,
imageScanOnPush: true,
})
// Add the lifecycle rules
this.ecrFrontEndRepository.addLifecycleRule(LifecycleRule)
this.ecrBackEndRepository.addLifecycleRule(LifecycleRule)
// Create Docker Image Asset
const dockerFrontEndImageAsset = new DockerImageAsset(this, "DockerFrontEndImageAsset", {
directory: frontendPath,
file:"cdk.Dockerfile",
buildArgs:{
"BACKEND_URL":`http://backend.${props.cloudmapNamespace.namespaceName}:7860`
},
exclude: excludeDir,
platform: imagePlatform,
});
const dockerBackEndImageAsset = new DockerImageAsset(this, "DockerBackEndImageAsset", {
directory: backendPath,
file:"cdk.Dockerfile",
exclude: excludeDir,
platform: imagePlatform,
});
// Deploy Docker Image to ECR Repository
new ecrdeploy.ECRDeployment(this, "DeployFrontEndImage", {
src: new ecrdeploy.DockerImageName(dockerFrontEndImageAsset.imageUri),
dest: new ecrdeploy.DockerImageName(this.ecrFrontEndRepository.repositoryUri)
});
// Deploy Docker Image to ECR Repository
new ecrdeploy.ECRDeployment(this, "DeployBackEndImage", {
src: new ecrdeploy.DockerImageName(dockerBackEndImageAsset.imageUri),
dest: new ecrdeploy.DockerImageName(this.ecrBackEndRepository.repositoryUri)
});
}
}


@ -0,0 +1,117 @@
import { Duration } from 'aws-cdk-lib'
import { Construct } from 'constructs'
import {
aws_ec2 as ec2,
aws_ecs as ecs,
aws_ecr as ecr,
aws_servicediscovery as servicediscovery,
aws_iam as iam,
aws_logs as logs,
aws_elasticloadbalancingv2 as elb,
} from 'aws-cdk-lib';
import { CpuArchitecture } from 'aws-cdk-lib/aws-ecs';
interface FrontEndProps {
cluster:ecs.Cluster
ecsFrontSG:ec2.SecurityGroup
ecrFrontEndRepository:ecr.Repository
targetGroup: elb.ApplicationTargetGroup;
backendServiceName: string;
frontendTaskRole: iam.Role;
frontendTaskExecutionRole: iam.Role;
frontendLogGroup: logs.LogGroup;
cloudmapNamespace: servicediscovery.PrivateDnsNamespace;
arch:ecs.CpuArchitecture;
}
export class FrontEndCluster extends Construct {
constructor(scope: Construct, id: string, props:FrontEndProps) {
super(scope, id)
const containerPort = 3000
const frontendTaskDefinition = new ecs.FargateTaskDefinition(
this,
'FrontendTaskDef',
{
memoryLimitMiB: 3072,
cpu: 1024,
executionRole: props.frontendTaskExecutionRole,
runtimePlatform:{
operatingSystemFamily: ecs.OperatingSystemFamily.LINUX,
cpuArchitecture: props.arch,
},
taskRole: props.frontendTaskRole,
}
);
const frontendServiceName = 'frontend'
frontendTaskDefinition.addContainer('frontendContainer', {
image: ecs.ContainerImage.fromEcrRepository(props.ecrFrontEndRepository, "latest"),
containerName:'langflow-front-container',
environment: {
BACKEND_SERVICE_NAME: props.backendServiceName,
BACKEND_URL: `http://${props.backendServiceName}.${props.cloudmapNamespace.namespaceName}:7860/`,
VITE_PROXY_TARGET: `http://${props.backendServiceName}.${props.cloudmapNamespace.namespaceName}:7860/`,
},
logging: ecs.LogDriver.awsLogs({
streamPrefix: 'my-stream',
logGroup: props.frontendLogGroup,
}),
portMappings: [
{
name:frontendServiceName,
containerPort: containerPort,
protocol: ecs.Protocol.TCP,
appProtocol:ecs.AppProtocol.http,
},
],
});
const frontendService = new ecs.FargateService(
this,
'FrontendService',
{
serviceName: frontendServiceName,
cluster: props.cluster,
desiredCount: 1,
assignPublicIp: false,
taskDefinition: frontendTaskDefinition,
enableExecuteCommand: true,
securityGroups: [props.ecsFrontSG],
cloudMapOptions: {
cloudMapNamespace: props.cloudmapNamespace,
containerPort: containerPort,
dnsRecordType: servicediscovery.DnsRecordType.A,
dnsTtl: Duration.seconds(10),
name: frontendServiceName
},
healthCheckGracePeriod: Duration.seconds(1000),
}
);
props.targetGroup.addTarget(frontendService);
// // Create ALB and ECS Fargate Service
// const frontService = new ecs_patterns.ApplicationLoadBalancedFargateService(
// this,
// "FrontEndService",
// {
// cluster: cluster,
// serviceName: 'langflow-frontend-service',
// cpu: 256,
// memoryLimitMiB: 512,
// listenerPort: 80,
// assignPublicIp: true, // Public facing - ALB
// taskSubnets: { subnetType: ec2.SubnetType.PRIVATE_WITH_EGRESS },
// securityGroups:[ecsFrontSG],
// taskImageOptions: {
// family: 'langflow-taskdef',
// containerName: 'langflow-front-container',
// image: ecs.ContainerImage.fromEcrRepository(ecrFrontEndRepository, "latest"),
// containerPort: 3000, // With the L2 construct, the target group port should be set to 3000
// },
// loadBalancer:alb,
// openListener:false,
// }
// );
}
}


@ -0,0 +1,101 @@
import { RemovalPolicy, Duration } from 'aws-cdk-lib'
import { Construct } from 'constructs'
import {
aws_rds as rds,
aws_iam as iam,
} from 'aws-cdk-lib';
interface IAMProps {
rdsCluster:rds.DatabaseCluster
}
export class EcsIAM extends Construct {
readonly frontendTaskRole: iam.Role;
readonly frontendTaskExecutionRole: iam.Role;
readonly backendTaskRole: iam.Role;
readonly backendTaskExecutionRole: iam.Role;
constructor(scope: Construct, id: string, props:IAMProps) {
super(scope, id)
// Policy Statements
// ECS Policy State
const ECSExecPolicyStatement = new iam.PolicyStatement({
sid: 'allowECSExec',
resources: ['*'],
actions: [
'ecr:GetAuthorizationToken',
'ecr:BatchCheckLayerAvailability',
'ecr:GetDownloadUrlForLayer',
'ecr:BatchGetImage',
],
});
// Bedrock Policy State
const BedrockPolicyStatement = new iam.PolicyStatement({
sid: 'allowBedrockAccess',
resources: ['*'],
actions: [
'bedrock:*',
],
});
// Kendra Policy State
const KendraPolicyStatement = new iam.PolicyStatement({
sid: 'allowKendraAccess',
resources: ['*'],
actions: [
'kendra:*'
],
});
// Create Rag Policy
const RagAccessPolicy = new iam.Policy(this, 'RAGFullAccess', {
statements: [KendraPolicyStatement,BedrockPolicyStatement],
})
// Policy for fetching the DB credentials from Secrets Manager
const SecretsManagerPolicy = new iam.Policy(this, 'SMGetPolicy', {
statements: [new iam.PolicyStatement({
actions: ['secretsmanager:GetSecretValue'],
resources: [props.rdsCluster.secret!.secretArn],
})],
})
// FrontEnd Task Role
this.frontendTaskRole = new iam.Role(this, 'FrontendTaskRole', {
assumedBy: new iam.ServicePrincipal('ecs-tasks.amazonaws.com'),
});
this.frontendTaskRole.addToPolicy(ECSExecPolicyStatement);
// BackEnd Task Role
this.backendTaskRole = new iam.Role(this, 'BackendTaskRole', {
assumedBy: new iam.ServicePrincipal('ecs-tasks.amazonaws.com'),
});
// Attach the ECS Exec policy
this.backendTaskRole.addToPolicy(ECSExecPolicyStatement);
// Grant access to Kendra and Bedrock
this.backendTaskRole.attachInlinePolicy(RagAccessPolicy);
// FrontEnd Task ExecutionRole
this.frontendTaskExecutionRole = new iam.Role(this, 'frontendTaskExecutionRole', {
assumedBy: new iam.ServicePrincipal('ecs-tasks.amazonaws.com'),
managedPolicies: [
{
managedPolicyArn:
'arn:aws:iam::aws:policy/service-role/AmazonECSTaskExecutionRolePolicy',
},
],
});
// BackEnd Task ExecutionRole
this.backendTaskExecutionRole = new iam.Role(this, 'backendTaskExecutionRole', {
assumedBy: new iam.ServicePrincipal('ecs-tasks.amazonaws.com'),
managedPolicies: [
{
managedPolicyArn:
'arn:aws:iam::aws:policy/service-role/AmazonECSTaskExecutionRolePolicy',
},
],
});
this.backendTaskExecutionRole.attachInlinePolicy(SecretsManagerPolicy);
this.backendTaskExecutionRole.attachInlinePolicy(RagAccessPolicy);
}
}


@ -0,0 +1,6 @@
export * from './db';
export * from './ecr';
export * from './iam';
export * from './frontend';
export * from './backend';
export * from './network';


@ -0,0 +1,143 @@
import { RemovalPolicy, Duration, CfnOutput } from 'aws-cdk-lib'
import { Construct } from 'constructs'
import {
aws_ec2 as ec2,
aws_ecs as ecs,
aws_logs as logs,
aws_servicediscovery as servicediscovery,
aws_elasticloadbalancingv2 as elb,
} from 'aws-cdk-lib';
export class Network extends Construct {
readonly vpc: ec2.Vpc;
readonly cluster: ecs.Cluster;
readonly alb: elb.IApplicationLoadBalancer;
readonly targetGroup: elb.ApplicationTargetGroup;
readonly cloudmapNamespace: servicediscovery.PrivateDnsNamespace;
readonly ecsFrontSG: ec2.SecurityGroup;
readonly ecsBackSG: ec2.SecurityGroup;
readonly dbSG: ec2.SecurityGroup;
readonly albSG: ec2.SecurityGroup;
readonly backendLogGroup: logs.LogGroup;
readonly frontendLogGroup: logs.LogGroup;
constructor(scope: Construct, id: string) {
super(scope, id)
const alb_listen_port=80
const front_service_port=3000
const back_service_port=7860
// Create the VPC and related resources
this.vpc = new ec2.Vpc(scope, 'VPC', {
vpcName: 'langflow-vpc',
ipAddresses: ec2.IpAddresses.cidr('10.0.0.0/16'),
maxAzs: 3,
subnetConfiguration: [
{
cidrMask: 24,
name: 'langflow-Isolated',
subnetType: ec2.SubnetType.PRIVATE_ISOLATED,
},
{
cidrMask: 24,
name: 'langflow-Public',
subnetType: ec2.SubnetType.PUBLIC,
},
{
cidrMask: 24,
name: 'langflow-Private',
subnetType: ec2.SubnetType.PRIVATE_WITH_EGRESS
},
],
natGateways: 1,
})
// Cluster
this.cluster = new ecs.Cluster(this, 'EcsCluster', {
clusterName: 'langflow-cluster',
vpc: this.vpc,
enableFargateCapacityProviders: true,
});
// Private DNS
this.cloudmapNamespace = new servicediscovery.PrivateDnsNamespace(
this,
'Namespace',
{
name: 'ecs-deploy.com',
vpc: this.vpc,
}
);
// Security group attached to the ALB
this.albSG = new ec2.SecurityGroup(scope, 'ALBSecurityGroup', {
securityGroupName: 'alb-sg',
description: 'for alb',
vpc: this.vpc,
})
this.albSG.addIngressRule(ec2.Peer.anyIpv4(), ec2.Port.tcp(alb_listen_port))
this.alb = new elb.ApplicationLoadBalancer(this,'langflow-alb',{
internetFacing: true, // whether the ALB accepts traffic from the internet
loadBalancerName: 'langflow-alb',
securityGroup: this.albSG, // assign the security group created above
vpc:this.vpc,
})
const listener = this.alb.addListener('Listener', { port: alb_listen_port });
this.targetGroup = listener.addTargets('targetGroup', {
port: front_service_port,
protocol: elb.ApplicationProtocol.HTTP,
healthCheck: {
enabled: true,
path: '/health',
healthyThresholdCount: 2,
unhealthyThresholdCount: 4,
interval: Duration.seconds(100),
timeout: Duration.seconds(30),
healthyHttpCodes: '200',
},
});
// Security group attached to the frontend ECS service
this.ecsFrontSG = new ec2.SecurityGroup(scope, 'ECSFrontEndSecurityGroup', {
securityGroupName: 'langflow-ecs-front-sg',
description: 'for langflow-front-ecs',
vpc: this.vpc,
})
this.ecsFrontSG.addIngressRule(this.albSG, ec2.Port.allTcp())
// Security group attached to the backend ECS service
this.ecsBackSG = new ec2.SecurityGroup(scope, 'ECSBackEndSecurityGroup', {
securityGroupName: 'langflow-ecs-back-sg',
description: 'for langflow-back-ecs',
vpc: this.vpc,
})
this.ecsBackSG.addIngressRule(this.ecsFrontSG, ec2.Port.tcp(back_service_port))
// Security group attached to RDS
this.dbSG = new ec2.SecurityGroup(scope, 'DBSecurityGroup', {
allowAllOutbound: true,
securityGroupName: 'langflow-db',
description: 'for langflow-db',
vpc: this.vpc,
})
// Allow inbound on port 3306 (MySQL; use 5432 for Postgres) from the backend ECS security group
this.dbSG.addIngressRule(this.ecsBackSG, ec2.Port.tcp(3306))
// Create CloudWatch Log Group
this.backendLogGroup = new logs.LogGroup(this, 'backendLogGroup', {
logGroupName: 'langflow-backend-logs',
removalPolicy: RemovalPolicy.DESTROY,
});
this.frontendLogGroup = new logs.LogGroup(this, 'frontendLogGroup', {
logGroupName: 'langflow-frontend-logs',
removalPolicy: RemovalPolicy.DESTROY,
});
new CfnOutput(this, 'URL', {
value: `http://${this.alb.loadBalancerDnsName}`,
});
}
}

scripts/aws/package-lock.json (generated, 4613 lines; diff suppressed because it is too large)

scripts/aws/package.json (new file, 29 lines)

@ -0,0 +1,29 @@
{
"name": "cdk",
"version": "0.1.0",
"bin": {
"cdk": "bin/cdk.js"
},
"scripts": {
"build": "tsc",
"watch": "tsc -w",
"test": "jest",
"cdk": "cdk"
},
"devDependencies": {
"@types/jest": "^29.5.1",
"@types/node": "20.1.7",
"aws-cdk": "2.86.0",
"jest": "^29.5.0",
"ts-jest": "^29.1.0",
"ts-node": "^10.9.1",
"typescript": "~5.1.3"
},
"dependencies": {
"aws-cdk-lib": "^2.86.0",
"cdk-ecr-deployment": "^2.5.30",
"constructs": "^10.0.0",
"dotenv": "^16.3.1",
"source-map-support": "^0.5.21"
}
}


@ -0,0 +1,17 @@
// import * as cdk from 'aws-cdk-lib';
// import { Template } from 'aws-cdk-lib/assertions';
// import * as Cdk from '../lib/cdk-stack';
// example test. To run these tests, uncomment this file along with the
// example resource in lib/cdk-stack.ts
test('SQS Queue Created', () => {
// const app = new cdk.App();
// // WHEN
// const stack = new Cdk.CdkStack(app, 'MyTestStack');
// // THEN
// const template = Template.fromStack(stack);
// template.hasResourceProperties('AWS::SQS::Queue', {
// VisibilityTimeout: 300
// });
});


@ -1,9 +1,10 @@
from importlib import metadata
from langflow.interface.custom.custom_component import CustomComponent
# Deactivate cache manager for now
# from langflow.services.cache import cache_service
from langflow.processing.process import load_flow_from_json
from langflow.interface.custom.custom_component import CustomComponent
from langflow.processing.load import load_flow_from_json
try:
__version__ = metadata.version(__package__)
@ -12,4 +13,4 @@ except metadata.PackageNotFoundError:
__version__ = ""
del metadata # optional, avoids polluting the results of dir(__package__)
__all__ = ["load_flow_from_json", "cache_service", "CustomComponent"]
__all__ = ["load_flow_from_json", "CustomComponent"]


@ -8,8 +8,7 @@ Create Date: 2023-12-13 18:55:52.587360
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa # noqa: F401
import sqlmodel # noqa: F401
# revision identifiers, used by Alembic.
revision: str = '006b3990db50'
down_revision: Union[str, None] = '1ef9c4f3765d'
@ -19,27 +18,32 @@ depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('apikey', schema=None) as batch_op:
batch_op.create_unique_constraint('uq_apikey_id', ['id'])
try:
with op.batch_alter_table('apikey', schema=None) as batch_op:
batch_op.create_unique_constraint('uq_apikey_id', ['id'])
with op.batch_alter_table('flow', schema=None) as batch_op:
batch_op.create_unique_constraint('uq_flow_id', ['id'])
with op.batch_alter_table('flow', schema=None) as batch_op:
batch_op.create_unique_constraint('uq_flow_id', ['id'])
with op.batch_alter_table('user', schema=None) as batch_op:
batch_op.create_unique_constraint('uq_user_id', ['id'])
with op.batch_alter_table('user', schema=None) as batch_op:
batch_op.create_unique_constraint('uq_user_id', ['id'])
except Exception:
pass
# ### end Alembic commands ###
def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('user', schema=None) as batch_op:
batch_op.drop_constraint('uq_user_id', type_='unique')
try:
with op.batch_alter_table('user', schema=None) as batch_op:
batch_op.drop_constraint('uq_user_id', type_='unique')
with op.batch_alter_table('flow', schema=None) as batch_op:
batch_op.drop_constraint('uq_flow_id', type_='unique')
with op.batch_alter_table('apikey', schema=None) as batch_op:
batch_op.drop_constraint('uq_apikey_id', type_='unique')
with op.batch_alter_table('flow', schema=None) as batch_op:
batch_op.drop_constraint('uq_flow_id', type_='unique')
with op.batch_alter_table('apikey', schema=None) as batch_op:
batch_op.drop_constraint('uq_apikey_id', type_='unique')
except Exception:
pass
# ### end Alembic commands ###


@ -2,9 +2,6 @@ import time
from fastapi import APIRouter, Depends, HTTPException, Query, WebSocket, WebSocketException, status
from fastapi.responses import StreamingResponse
from loguru import logger
from sqlmodel import Session
from langflow.api.utils import build_input_keys_response, format_elapsed_time
from langflow.api.v1.schemas import BuildStatus, BuiltResponse, InitResponse, StreamData
from langflow.graph.graph.base import Graph
@ -13,6 +10,8 @@ from langflow.services.cache.service import BaseCacheService
from langflow.services.cache.utils import update_build_status
from langflow.services.chat.service import ChatService
from langflow.services.deps import get_cache_service, get_chat_service, get_session
from loguru import logger
from sqlmodel import Session
router = APIRouter(tags=["Chat"])
@ -149,13 +148,12 @@ async def stream_build(
logger.debug("No user_id found in cache_service")
user_id = None
for i, vertex in enumerate(graph.generator_build(), 1):
start_time = time.perf_counter()
try:
log_dict = {
"log": f"Building node {vertex.vertex_type}",
}
yield str(StreamData(event="log", data=log_dict))
# time this
start_time = time.perf_counter()
if vertex.is_task:
vertex = await try_running_celery_task(vertex, user_id)
else:


@ -1,5 +1,5 @@
from http import HTTPStatus
from typing import Annotated, Optional, Union
from typing import Annotated, Any, List, Optional, Union
import sqlalchemy as sa
from fastapi import APIRouter, Body, Depends, HTTPException, UploadFile, status
@ -16,7 +16,7 @@ from langflow.api.v1.schemas import (
)
from langflow.interface.custom.custom_component import CustomComponent
from langflow.interface.custom.directory_reader import DirectoryReader
from langflow.interface.types import build_custom_component_template, create_and_validate_component
from langflow.interface.custom.utils import build_custom_component_template
from langflow.processing.process import process_graph_cached, process_tweaks
from langflow.services.auth.utils import api_key_security, get_current_active_user
from langflow.services.cache.utils import save_uploaded_file
@ -40,6 +40,75 @@ from langflow.services.task.service import TaskService
router = APIRouter(tags=["Base"])
async def process_graph_data(
graph_data: dict,
inputs: Optional[Union[List[dict], dict]] = None,
tweaks: Optional[dict] = None,
clear_cache: bool = False,
session_id: Optional[str] = None,
task_service: "TaskService" = Depends(get_task_service),
sync: bool = True,
):
task_result: Any = None
task_status = None
if tweaks:
try:
graph_data = process_tweaks(graph_data, tweaks)
except Exception as exc:
logger.error(f"Error processing tweaks: {exc}")
if sync:
result = await process_graph_cached(
graph_data,
inputs,
clear_cache,
session_id,
)
task_id = str(id(result))
if isinstance(result, dict) and "result" in result:
task_result = result["result"]
session_id = result["session_id"]
elif hasattr(result, "result") and hasattr(result, "session_id"):
task_result = result.result
session_id = result.session_id
else:
task_result = result
else:
logger.warning(
"This is an experimental feature and may not work as expected."
"Please report any issues to our GitHub repository."
)
if session_id is None:
# Generate a session ID
session_id = get_session_service().generate_key(session_id=session_id, data_graph=graph_data)
task_id, task = await task_service.launch_task(
process_graph_cached_task if task_service.use_celery else process_graph_cached,
graph_data,
inputs,
clear_cache,
session_id,
)
task_status = task.status
if task.status == "FAILURE":
logger.error(f"Task {task_id} failed: {task.traceback}")
task_result = str(task._exception)
else:
task_result = task.result
if task_id:
task_response = TaskResponse(id=task_id, href=f"api/v1/task/{task_id}")
else:
task_response = None
return ProcessResponse(
result=task_result,
status=task_status,
task=task_response,
session_id=session_id,
backend=task_service.backend_name,
)
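The sync branch above accepts three result shapes (a dict with `result`/`session_id` keys, an object exposing those attributes, or a bare value); a minimal standalone sketch of that unpacking, with a helper name of our own invention:

```python
# Standalone sketch (not Langflow code) of the result unpacking performed in
# the sync branch of process_graph_data above.
from types import SimpleNamespace


def unpack_result(result, session_id=None):
    """Return (task_result, session_id) for the three supported shapes."""
    if isinstance(result, dict) and "result" in result:
        return result["result"], result["session_id"]
    if hasattr(result, "result") and hasattr(result, "session_id"):
        return result.result, result.session_id
    # Bare value: keep the caller's session_id unchanged
    return result, session_id


print(unpack_result({"result": 42, "session_id": "abc"}))        # (42, 'abc')
print(unpack_result(SimpleNamespace(result=1, session_id="s")))  # (1, 's')
print(unpack_result("plain", "keep"))                            # ('plain', 'keep')
```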
@router.get("/all", dependencies=[Depends(get_current_active_user)])
def get_all(
settings_service=Depends(get_settings_service),
@ -53,7 +122,32 @@ def get_all(
raise HTTPException(status_code=500, detail=str(exc)) from exc
# For backwards compatibility we will keep the old endpoint
@router.post("/process/json", response_model=ProcessResponse)
async def process_json(
session: Annotated[Session, Depends(get_session)],
data: dict,
inputs: Optional[dict] = None,
tweaks: Optional[dict] = None,
clear_cache: Annotated[bool, Body(embed=True)] = False, # noqa: F821
session_id: Annotated[Union[None, str], Body(embed=True)] = None, # noqa: F821
task_service: "TaskService" = Depends(get_task_service),
sync: Annotated[bool, Body(embed=True)] = True, # noqa: F821
):
try:
return await process_graph_data(
graph_data=data,
inputs=inputs,
tweaks=tweaks,
clear_cache=clear_cache,
session_id=session_id,
task_service=task_service,
sync=sync,
)
except Exception as exc:
logger.exception(exc)
raise HTTPException(status_code=500, detail=str(exc)) from exc
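For reference, a hedged sketch of the JSON body this `/process/json` endpoint accepts, inferred from its signature (each `Body(embed=True)` parameter becomes a top-level key; the flow data and node id below are placeholders, not real values):

```python
# Hypothetical request body for POST /api/v1/process/json, built from the
# endpoint signature above; "MyNode-abc12" is an invented node id.
import json

payload = {
    "data": {"nodes": [], "edges": []},               # an exported flow graph
    "inputs": {"input": "Hello"},
    "tweaks": {"MyNode-abc12": {"temperature": 0.2}},
    "clear_cache": False,
    "session_id": None,
    "sync": True,
}
body = json.dumps(payload)
print(sorted(payload))
# ['clear_cache', 'data', 'inputs', 'session_id', 'sync', 'tweaks']
```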
@router.post(
"/predict/{flow_id}",
response_model=ProcessResponse,
@ -66,7 +160,7 @@ def get_all(
async def process(
session: Annotated[Session, Depends(get_session)],
flow_id: str,
inputs: Optional[dict] = None,
inputs: Optional[Union[List[dict], dict]] = None,
tweaks: Optional[dict] = None,
clear_cache: Annotated[bool, Body(embed=True)] = False, # noqa: F821
session_id: Annotated[Union[None, str], Body(embed=True)] = None, # noqa: F821
@ -94,54 +188,14 @@ async def process(
if flow.data is None:
raise ValueError(f"Flow {flow_id} has no data")
graph_data = flow.data
task_result = None
if tweaks:
try:
graph_data = process_tweaks(graph_data, tweaks)
except Exception as exc:
logger.error(f"Error processing tweaks: {exc}")
if sync:
task_id, result = await task_service.launch_and_await_task(
process_graph_cached_task if task_service.use_celery else process_graph_cached,
graph_data,
inputs,
clear_cache,
session_id,
)
if isinstance(result, dict) and "result" in result:
task_result = result["result"]
session_id = result["session_id"]
elif hasattr(result, "result") and hasattr(result, "session_id"):
task_result = result.result
session_id = result.session_id
else:
logger.warning(
"This is an experimental feature and may not work as expected."
"Please report any issues to our GitHub repository."
)
if session_id is None:
# Generate a session ID
session_id = get_session_service().generate_key(session_id=session_id, data_graph=graph_data)
task_id, task = await task_service.launch_task(
process_graph_cached_task if task_service.use_celery else process_graph_cached,
graph_data,
inputs,
clear_cache,
session_id,
)
task_result = task.status
if task_id:
task_response = TaskResponse(id=task_id, href=f"api/v1/task/{task_id}")
else:
task_response = None
return ProcessResponse(
result=task_result,
task=task_response,
return await process_graph_data(
graph_data=graph_data,
inputs=inputs,
tweaks=tweaks,
clear_cache=clear_cache,
session_id=session_id,
backend=task_service.backend_name,
task_service=task_service,
sync=sync,
)
except sa.exc.StatementError as exc:
# StatementError('(builtins.ValueError) badly formed hexadecimal UUID string')
@ -164,6 +218,8 @@ async def get_task_status(task_id: str):
task_service = get_task_service()
task = task_service.get_task(task_id)
result = None
if task is None:
raise HTTPException(status_code=404, detail="Task not found")
if task.ready():
result = task.result
# If result isinstance of Exception, can we get the traceback?
@ -175,8 +231,6 @@ async def get_task_status(task_id: str):
elif hasattr(result, "result"):
result = result.result
if task is None:
raise HTTPException(status_code=404, detail="Task not found")
if task.status == "FAILURE":
result = str(task.result)
logger.error(f"Task {task_id} failed: {task.traceback}")
@ -216,7 +270,7 @@ async def custom_component(
raw_code: CustomComponentCode,
user: User = Depends(get_current_active_user),
):
component = create_and_validate_component(raw_code.code)
component = CustomComponent(code=raw_code.code)
built_frontend_node = build_custom_component_template(component, user_id=user.id)
@ -226,7 +280,7 @@ async def custom_component(
@router.post("/custom_component/reload", status_code=HTTPStatus.OK)
async def reload_custom_component(path: str, user: User = Depends(get_current_active_user)):
from langflow.interface.types import build_custom_component_template
from langflow.interface.custom.utils import build_custom_component_template
try:
reader = DirectoryReader("")
@ -235,7 +289,6 @@ async def reload_custom_component(path: str, user: User = Depends(get_current_ac
raise ValueError(content)
extractor = CustomComponent(code=content)
extractor.validate()
return build_custom_component_template(extractor, user_id=user.id)
except Exception as exc:
raise HTTPException(status_code=400, detail=str(exc))
@ -246,7 +299,7 @@ async def custom_component_update(
raw_code: CustomComponentCode,
user: User = Depends(get_current_active_user),
):
component = create_and_validate_component(raw_code.code)
component = CustomComponent(code=raw_code.code)
component_node = build_custom_component_template(component, user_id=user.id, update_field=raw_code.field)
# Update the field


@ -3,12 +3,11 @@ from pathlib import Path
from typing import Any, Dict, List, Optional, Union
from uuid import UUID
from pydantic import BaseModel, Field, field_validator
from langflow.services.database.models.api_key.model import ApiKeyRead
from langflow.services.database.models.base import orjson_dumps
from langflow.services.database.models.flow import FlowCreate, FlowRead
from langflow.services.database.models.user import UserRead
from pydantic import BaseModel, Field, field_validator
class BuildStatus(Enum):
@ -59,6 +58,7 @@ class ProcessResponse(BaseModel):
"""Process response schema."""
result: Any
status: Optional[str] = None
task: Optional[TaskResponse] = None
session_id: Optional[str] = None
backend: Optional[str] = None


@ -1,7 +1,6 @@
from typing import Callable, List, Union
from typing import Callable, List, Optional, Union
from langchain.agents import AgentExecutor, AgentType, initialize_agent, types
from langflow import CustomComponent
from langflow.field_typing import BaseChatMemory, BaseLanguageModel, Tool
@ -20,18 +19,34 @@ class AgentInitializerComponent(CustomComponent):
"memory": {"display_name": "Memory"},
"tools": {"display_name": "Tools"},
"llm": {"display_name": "Language Model"},
"code": {"advanced": True},
}
def build(
self, agent: str, llm: BaseLanguageModel, memory: BaseChatMemory, tools: List[Tool], max_iterations: int
self,
agent: str,
llm: BaseLanguageModel,
tools: List[Tool],
max_iterations: int,
memory: Optional[BaseChatMemory] = None,
) -> Union[AgentExecutor, Callable]:
agent = AgentType(agent)
return initialize_agent(
tools=tools,
llm=llm,
agent=agent,
memory=memory,
return_intermediate_steps=True,
handle_parsing_errors=True,
max_iterations=max_iterations,
)
if memory:
return initialize_agent(
tools=tools,
llm=llm,
agent=agent,
memory=memory,
return_intermediate_steps=True,
handle_parsing_errors=True,
max_iterations=max_iterations,
)
else:
return initialize_agent(
tools=tools,
llm=llm,
agent=agent,
return_intermediate_steps=True,
handle_parsing_errors=True,
max_iterations=max_iterations,
)
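The `if memory:` branch above repeats the entire `initialize_agent` call; a sketch (helper name ours, not the shipped code) of an alternative that assembles the keyword arguments conditionally instead:

```python
# Sketch: build the initialize_agent kwargs once and add memory only when set,
# instead of duplicating the call in both branches.
def build_agent_kwargs(tools, llm, agent, max_iterations, memory=None):
    kwargs = dict(
        tools=tools,
        llm=llm,
        agent=agent,
        return_intermediate_steps=True,
        handle_parsing_errors=True,
        max_iterations=max_iterations,
    )
    if memory is not None:
        kwargs["memory"] = memory
    return kwargs  # then: initialize_agent(**kwargs)


print("memory" in build_agent_kwargs([], None, "zero-shot-react-description", 5))            # False
print("memory" in build_agent_kwargs([], None, "zero-shot-react-description", 5, object()))  # True
```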


@ -9,7 +9,9 @@ from langchain.prompts import SystemMessagePromptTemplate
from langchain.prompts.chat import MessagesPlaceholder
from langchain.schema.memory import BaseMemory
from langchain.tools import Tool
from langflow import CustomComponent
from langflow.field_typing.range_spec import RangeSpec
class ConversationalAgent(CustomComponent):
@ -35,6 +37,11 @@ class ConversationalAgent(CustomComponent):
"value": openai_function_models[0],
},
"code": {"show": False},
"temperature": {
"display_name": "Temperature",
"value": 0.2,
"range_spec": RangeSpec(min=0, max=2, step=0.1),
},
}
def build(
@ -46,11 +53,14 @@ class ConversationalAgent(CustomComponent):
memory: Optional[BaseMemory] = None,
system_message: Optional[SystemMessagePromptTemplate] = None,
max_token_limit: int = 2000,
temperature: float = 0.9,
) -> AgentExecutor:
llm = ChatOpenAI(
model=model_name,
api_key=openai_api_key,
base_url=openai_api_base,
max_tokens=max_token_limit,
temperature=temperature,
)
if not memory:
memory_key = "chat_history"


@ -0,0 +1,79 @@
from typing import Optional
from langflow import CustomComponent
from langchain.llms.base import BaseLanguageModel
from langchain.chat_models.azure_openai import AzureChatOpenAI
class AzureChatOpenAIComponent(CustomComponent):
display_name: str = "AzureChatOpenAI"
description: str = "LLM model from Azure OpenAI."
documentation: str = "https://python.langchain.com/docs/integrations/llms/azure_openai"
AZURE_OPENAI_MODELS = [
"gpt-35-turbo",
"gpt-35-turbo-16k",
"gpt-35-turbo-instruct",
"gpt-4",
"gpt-4-32k",
"gpt-4-vision",
]
def build_config(self):
return {
"model": {
"display_name": "Model Name",
"value": "gpt-35-turbo",
"options": self.AZURE_OPENAI_MODELS,
"required": True,
},
"azure_endpoint": {
"display_name": "Azure Endpoint",
"required": True,
"info": "Your Azure endpoint, including the resource. Example: `https://example-resource.azure.openai.com/`",
},
"azure_deployment": {
"display_name": "Deployment Name",
"required": True,
},
"api_version": {
"display_name": "API Version",
"value": "2023-05-15",
"required": True,
"advanced": True,
},
"api_key": {"display_name": "API Key", "required": True, "password": True},
"temperature": {
"display_name": "Temperature",
"value": 0.7,
"field_type": "float",
"required": False,
},
"max_tokens": {
"display_name": "Max Tokens",
"value": 1000,
"required": False,
"field_type": "int",
"advanced": True,
},
"code": {"show": False},
}
def build(
self,
model: str,
azure_endpoint: str,
azure_deployment: str,
api_key: str,
api_version: str = "2023-05-15",
temperature: float = 0.7,
max_tokens: Optional[int] = 1000,
) -> BaseLanguageModel:
return AzureChatOpenAI(
model=model,
azure_endpoint=azure_endpoint,
azure_deployment=azure_deployment,
api_version=api_version,
api_key=api_key,
temperature=temperature,
max_tokens=max_tokens,
)


@ -0,0 +1,72 @@
from typing import Optional
from langchain_google_genai import ChatGoogleGenerativeAI # type: ignore
from langflow import CustomComponent
from langflow.field_typing import BaseLanguageModel, RangeSpec, TemplateField
class GoogleGenerativeAIComponent(CustomComponent):
display_name: str = "Google Generative AI"
description: str = "A component that uses Google Generative AI to generate text."
documentation: str = "http://docs.langflow.org/components/custom"
def build_config(self):
return {
"google_api_key": TemplateField(
display_name="Google API Key",
info="The Google API Key to use for the Google Generative AI.",
),
"max_output_tokens": TemplateField(
display_name="Max Output Tokens",
info="The maximum number of tokens to generate.",
),
"temperature": TemplateField(
display_name="Temperature",
info="Run inference with this temperature. Must be in the closed interval [0.0, 1.0].",
),
"top_k": TemplateField(
display_name="Top K",
info="Decode using top-k sampling: consider the set of top_k most probable tokens. Must be positive.",
range_spec=RangeSpec(min=0, max=2, step=0.1),
advanced=True,
),
"top_p": TemplateField(
display_name="Top P",
info="The maximum cumulative probability of tokens to consider when sampling.",
advanced=True,
),
"n": TemplateField(
display_name="N",
info="Number of chat completions to generate for each prompt. Note that the API may not return the full n completions if duplicates are generated.",
advanced=True,
),
"model": TemplateField(
display_name="Model",
info="The name of the model to use. Supported examples: gemini-pro",
options=["gemini-pro", "gemini-pro-vision"],
),
"code": TemplateField(
advanced=True,
),
}
def build(
self,
google_api_key: str,
model: str,
max_output_tokens: Optional[int] = None,
temperature: float = 0.1,
top_k: Optional[int] = None,
top_p: Optional[float] = None,
n: Optional[int] = 1,
) -> BaseLanguageModel:
return ChatGoogleGenerativeAI(
model=model,
max_output_tokens=max_output_tokens or None,
temperature=temperature,
top_k=top_k or None,
top_p=top_p or None,
n=n or 1,
google_api_key=google_api_key,
)
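A standalone sketch of the optional-argument coercion in `build()` above. Note that `x or default` treats every falsy value as unset, so `top_k=0` becomes `None` and `n=0` becomes `1`:

```python
# Mirrors the `max_output_tokens or None`, `top_k or None`, `n or 1` coercion
# used when constructing ChatGoogleGenerativeAI above.
def coerce_gemini_kwargs(max_output_tokens=None, top_k=None, top_p=None, n=1):
    return {
        "max_output_tokens": max_output_tokens or None,
        "top_k": top_k or None,
        "top_p": top_p or None,
        "n": n or 1,
    }


print(coerce_gemini_kwargs(top_k=0, n=0))
# {'max_output_tokens': None, 'top_k': None, 'top_p': None, 'n': 1}
```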


@ -0,0 +1,90 @@
import weaviate # type: ignore
from typing import Optional, Union
from langflow import CustomComponent
from langchain.vectorstores import Weaviate
from langchain.schema import Document
from langchain.vectorstores.base import VectorStore
from langchain.schema import BaseRetriever
from langchain.embeddings.base import Embeddings
class WeaviateVectorStore(CustomComponent):
display_name: str = "Weaviate"
description: str = "Implementation of Vector Store using Weaviate"
documentation = "https://python.langchain.com/docs/integrations/vectorstores/weaviate"
beta = True
field_config = {
"url": {"display_name": "Weaviate URL", "value": "http://localhost:8080"},
"api_key": {
"display_name": "API Key",
"password": True,
"required": False,
},
"index_name": {
"display_name": "Index name",
"required": False,
},
"text_key": {"display_name": "Text Key", "required": False, "advanced": True, "value": "text"},
"documents": {"display_name": "Documents", "is_list": True},
"embedding": {"display_name": "Embedding"},
"attributes": {
"display_name": "Attributes",
"required": False,
"is_list": True,
"field_type": "str",
"advanced": True,
},
"search_by_text": {"display_name": "Search By Text", "field_type": "bool", "advanced": True},
"code": {"show": False},
}
def build(
self,
url: str,
search_by_text: bool = False,
api_key: Optional[str] = None,
index_name: Optional[str] = None,
text_key: Optional[str] = "text",
embedding: Optional[Embeddings] = None,
documents: Optional[Document] = None,
attributes: Optional[list] = None,
) -> Union[VectorStore, BaseRetriever]:
if api_key:
auth_config = weaviate.AuthApiKey(api_key=api_key)
client = weaviate.Client(url=url, auth_client_secret=auth_config)
else:
client = weaviate.Client(url=url)
def _to_pascal_case(word: str):
if word and not word[0].isupper():
word = word.capitalize()
if word.isidentifier():
return word
word = word.replace("-", " ").replace("_", " ")
parts = word.split()
pascal_case_word = "".join([part.capitalize() for part in parts])
return pascal_case_word
index_name = _to_pascal_case(index_name) if index_name else None
if documents is not None and embedding is not None:
return Weaviate.from_documents(
client=client,
index_name=index_name,
documents=documents,
embedding=embedding,
by_text=search_by_text,
)
return Weaviate(
client=client,
index_name=index_name,
text_key=text_key,
embedding=embedding,
by_text=search_by_text,
attributes=attributes if attributes is not None else [],
)


@ -270,8 +270,6 @@ vectorstores:
# documentation: "https://python.langchain.com/docs/modules/data_connection/vectorstores/integrations/chroma"
Qdrant:
documentation: "https://python.langchain.com/docs/modules/data_connection/vectorstores/integrations/qdrant"
Weaviate:
documentation: "https://python.langchain.com/docs/modules/data_connection/vectorstores/integrations/weaviate"
FAISS:
documentation: "https://python.langchain.com/docs/modules/data_connection/vectorstores/integrations/faiss"
Pinecone:


@ -13,7 +13,6 @@ CUSTOM_NODES = {
"agents": {
"JsonAgent": frontend_node.agents.JsonAgentNode(),
"CSVAgent": frontend_node.agents.CSVAgentNode(),
"AgentInitializer": frontend_node.agents.InitializeAgentNode(),
"VectorStoreAgent": frontend_node.agents.VectorStoreAgentNode(),
"VectorStoreRouterAgent": frontend_node.agents.VectorStoreRouterAgentNode(),
"SQLAgent": frontend_node.agents.SQLAgentNode(),


@ -1,6 +1,6 @@
from typing import Any, List, Optional
from typing import Any, Optional
from langchain.agents import AgentExecutor, AgentType, Tool, ZeroShotAgent, initialize_agent
from langchain.agents import AgentExecutor, ZeroShotAgent
from langchain.agents.agent_toolkits import (
SQLDatabaseToolkit,
VectorStoreInfo,
@ -15,7 +15,6 @@ from langchain.agents.agent_toolkits.vectorstore.prompt import ROUTER_PREFIX as
from langchain.agents.mrkl.prompt import FORMAT_INSTRUCTIONS
from langchain.base_language import BaseLanguageModel
from langchain.chains.llm import LLMChain
from langchain.memory.chat_memory import BaseChatMemory
from langchain.sql_database import SQLDatabase
from langchain.tools.sql_database.prompt import QUERY_CHECKER
from langchain_experimental.agents.agent_toolkits.pandas.prompt import PREFIX as PANDAS_PREFIX
@ -264,45 +263,9 @@ class VectorStoreRouterAgent(CustomAgentExecutor):
return super().run(*args, **kwargs)
class InitializeAgent(CustomAgentExecutor):
"""Implementation of AgentInitializer function"""
@staticmethod
def function_name():
return "AgentInitializer"
@classmethod
def initialize(
cls,
llm: BaseLanguageModel,
tools: List[Tool],
agent: str,
memory: Optional[BaseChatMemory] = None,
):
# Find which value in the AgentType enum corresponds to the string
# passed in as agent
agent = AgentType(agent)
return initialize_agent(
tools=tools,
llm=llm,
# LangChain now uses Enum for agent, but we still support string
agent=agent, # type: ignore
memory=memory,
return_intermediate_steps=True,
handle_parsing_errors=True,
)
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
def run(self, *args, **kwargs):
return super().run(*args, **kwargs)
CUSTOM_AGENTS = {
"JsonAgent": JsonAgent,
"CSVAgent": CSVAgent,
"AgentInitializer": InitializeAgent,
"VectorStoreAgent": VectorStoreAgent,
"VectorStoreRouterAgent": VectorStoreRouterAgent,
"SQLAgent": SQLAgent,


@ -0,0 +1,3 @@
from .code_parser import CodeParser
__all__ = ["CodeParser"]


@ -6,7 +6,6 @@ from typing import Any, Dict, List, Type, Union
from cachetools import TTLCache, cachedmethod, keys
from fastapi import HTTPException
from langflow.interface.custom.schema import CallableCodeDetails, ClassCodeDetails
@ -57,7 +56,7 @@ class CodeParser:
ast.Assign: self.parse_global_vars,
}
def __get_tree(self):
def get_tree(self):
"""
Parses the provided code to validate its syntax.
It tries to parse the code into an abstract syntax tree (AST).
@ -313,7 +312,7 @@ class CodeParser:
"""
Runs all parsing operations and returns the resulting data.
"""
tree = self.__get_tree()
tree = self.get_tree()
for node in ast.walk(tree):
self.parse_node(node)


@ -0,0 +1,39 @@
import re
from types import GenericAlias
from typing import Any
def extract_inner_type(return_type: str) -> str:
"""
Extracts the inner type from a type hint that is a list.
"""
if match := re.match(r"list\[(.*)\]", return_type, re.IGNORECASE):
return match[1]
return return_type
def extract_inner_type_from_generic_alias(return_type: GenericAlias) -> Any:
"""
Extracts the inner type from a type hint that is a list.
"""
if return_type.__origin__ == list:
return list(return_type.__args__)
return return_type
def extract_union_types(return_type: str) -> list[str]:
"""
Extracts the member types from a string Union type hint.
"""
# If the return type is a Union, then we need to parse it
return_type = return_type.replace("Union", "").replace("[", "").replace("]", "")
return_types = return_type.split(",")
return [item.strip() for item in return_types]
def extract_union_types_from_generic_alias(return_type: GenericAlias) -> list:
"""
Extracts the inner type from a type hint that is a Union.
"""
return list(return_type.__args__)
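For reference, the two string-based helpers above can be exercised on their own. This is a self-contained sketch (the `Document` sample type is illustrative, not required by the helpers):

```python
import re


def extract_inner_type(return_type: str) -> str:
    # Pull the element type out of "list[X]" / "List[X]" hints.
    if match := re.match(r"list\[(.*)\]", return_type, re.IGNORECASE):
        return match[1]
    return return_type


def extract_union_types(return_type: str) -> list[str]:
    # Strip the "Union[...]" wrapper and split the member types.
    return_type = return_type.replace("Union", "").replace("[", "").replace("]", "")
    return [item.strip() for item in return_type.split(",")]


print(extract_inner_type("List[Document]"))         # Document
print(extract_union_types("Union[str, Document]"))  # ['str', 'Document']
```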

View file

@ -0,0 +1,3 @@
from .custom_component import CustomComponent
__all__ = ["CustomComponent"]

View file

@ -6,9 +6,7 @@ import yaml
from cachetools import TTLCache, cachedmethod
from fastapi import HTTPException
from langflow.interface.custom.component import Component
from langflow.interface.custom.directory_reader import DirectoryReader
from langflow.interface.custom.utils import (
from langflow.interface.custom.code_parser.utils import (
extract_inner_type_from_generic_alias,
extract_union_types_from_generic_alias,
)
@ -17,6 +15,8 @@ from langflow.services.database.utils import session_getter
from langflow.services.deps import get_credential_service, get_db_service
from langflow.utils import validate
from .component import Component
class CustomComponent(Component):
display_name: Optional[str] = None
@ -47,37 +47,9 @@ class CustomComponent(Component):
def build_config(self):
return self.field_config
def _class_template_validation(self, code: str):
TYPE_HINT_LIST = ["Optional", "Prompt", "PromptTemplate", "LLMChain"]
if not code:
raise HTTPException(
status_code=400,
detail={
"error": self.ERROR_CODE_NULL,
"traceback": "",
},
)
reader = DirectoryReader("", False)
for type_hint in TYPE_HINT_LIST:
if reader._is_type_hint_used_in_args(type_hint, code) and not reader._is_type_hint_imported(
type_hint, code
):
error_detail = {
"error": "Type hint Error",
"traceback": f"Type hint '{type_hint}' is used but not imported in the code.",
}
raise HTTPException(status_code=400, detail=error_detail)
return True
def validate(self) -> bool:
return self._class_template_validation(self.code) if self.code else False
@property
def tree(self):
return self.get_code_tree(self.code)
return self.get_code_tree(self.code or "")
@property
def get_function_entrypoint_args(self) -> list:
@ -105,11 +77,11 @@ class CustomComponent(Component):
@cachedmethod(operator.attrgetter("cache"))
def get_build_method(self):
if not self.code:
return []
return {}
component_classes = [cls for cls in self.tree["classes"] if self.code_class_base_inheritance in cls["bases"]]
if not component_classes:
return []
return {}
# Assume the first Component class is the one we're interested in
component_class = component_classes[0]
@ -117,19 +89,13 @@ class CustomComponent(Component):
method for method in component_class["methods"] if method["name"] == self.function_entrypoint_name
]
if not build_methods:
return []
return build_methods[0]
return build_methods[0] if build_methods else {}
@property
def get_function_entrypoint_return_type(self) -> List[Any]:
build_method = self.get_build_method()
if not build_method:
if not build_method or not build_method.get("has_return"):
return []
elif not build_method["has_return"]:
return []
return_type = build_method["return_type"]
# If list or List is in the return type, then we remove it and return the inner type
@ -138,10 +104,7 @@ class CustomComponent(Component):
# If the return type is not a Union, then we just return it as a list
if not hasattr(return_type, "__origin__") or return_type.__origin__ != Union:
if isinstance(return_type, list):
return return_type
return [return_type]
return return_type if isinstance(return_type, list) else [return_type]
# If the return type is a Union, then we need to parse it
return_type = extract_union_types_from_generic_alias(return_type)
return return_type
@ -207,9 +170,7 @@ class CustomComponent(Component):
"""Returns a function that returns the value at the given index in the iterable."""
def get_index(iterable: List[Any]):
if iterable:
return iterable[value]
return iterable
return iterable[value] if iterable else iterable
return get_index
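The closure returned here captures the index once and can then be applied to any iterable; empty iterables are passed through unchanged. A standalone sketch (the wrapper name `build_index_getter` is hypothetical; in the component it is a method):

```python
from typing import Any, List


def build_index_getter(value: int):
    """Returns a function that returns the value at the given index in the iterable."""

    def get_index(iterable: List[Any]):
        # Empty iterables are returned as-is instead of raising IndexError.
        return iterable[value] if iterable else iterable

    return get_index


first = build_index_getter(0)
print(first(["a", "b"]))  # a
print(first([]))          # []
```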

View file

@ -0,0 +1,3 @@
from .directory_reader import DirectoryReader
__all__ = ["DirectoryReader"]

View file

@ -1,6 +1,7 @@
import os
import ast
import os
import zlib
from loguru import logger
@ -63,12 +64,14 @@ class DirectoryReader:
return len(file_content.strip()) == 0
def filter_loaded_components(self, data: dict, with_errors: bool) -> dict:
from langflow.interface.custom.utils import build_component
items = [
{
"name": menu["name"],
"path": menu["path"],
"components": [
component
(*build_component(component), component)
for component in menu["components"]
if (component["error"] if with_errors else not component["error"])
],

View file

@ -0,0 +1,145 @@
from langflow.interface.custom.directory_reader import DirectoryReader
from langflow.template.frontend_node.custom_components import CustomComponentFrontendNode
from loguru import logger
def merge_nested_dicts_with_renaming(dict1, dict2):
for key, value in dict2.items():
if key in dict1 and isinstance(value, dict) and isinstance(dict1.get(key), dict):
for sub_key, sub_value in value.items():
# if sub_key in dict1[key]:
# new_key = get_new_key(dict1[key], sub_key)
# dict1[key][new_key] = sub_value
# else:
dict1[key][sub_key] = sub_value
else:
dict1[key] = value
return dict1
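The merge above is shallow-nested: when both dicts hold a dict under the same key, the sub-keys are merged (second dict wins on collisions); otherwise the second dict's value replaces the first. A runnable sketch with illustrative menu data:

```python
def merge_nested_dicts_with_renaming(dict1, dict2):
    for key, value in dict2.items():
        if key in dict1 and isinstance(value, dict) and isinstance(dict1.get(key), dict):
            # Merge one level of sub-keys; later values overwrite earlier ones.
            for sub_key, sub_value in value.items():
                dict1[key][sub_key] = sub_value
        else:
            dict1[key] = value
    return dict1


a = {"Agents": {"X": 1}}
b = {"Agents": {"Y": 2}, "Chains": {"Z": 3}}
print(merge_nested_dicts_with_renaming(a, b))
# {'Agents': {'X': 1, 'Y': 2}, 'Chains': {'Z': 3}}
```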
def build_invalid_menu(invalid_components):
"""Build the invalid menu."""
if not invalid_components.get("menu"):
return {}
logger.debug("------------------- INVALID COMPONENTS -------------------")
invalid_menu = {}
for menu_item in invalid_components["menu"]:
menu_name = menu_item["name"]
invalid_menu[menu_name] = build_invalid_menu_items(menu_item)
return invalid_menu
def build_valid_menu(valid_components):
"""Build the valid menu."""
valid_menu = {}
logger.debug("------------------- VALID COMPONENTS -------------------")
for menu_item in valid_components["menu"]:
menu_name = menu_item["name"]
valid_menu[menu_name] = build_menu_items(menu_item)
return valid_menu
def build_and_validate_all_files(reader: DirectoryReader, file_list):
"""Build and validate all files"""
data = reader.build_component_menu_list(file_list)
valid_components = reader.filter_loaded_components(data=data, with_errors=False)
invalid_components = reader.filter_loaded_components(data=data, with_errors=True)
return valid_components, invalid_components
def load_files_from_path(path: str):
"""Load all files from a given path"""
reader = DirectoryReader(path, False)
return reader.get_files()
def build_custom_component_list_from_path(path: str):
"""Build a list of custom components for the langchain from a given path"""
file_list = load_files_from_path(path)
reader = DirectoryReader(path, False)
valid_components, invalid_components = build_and_validate_all_files(reader, file_list)
valid_menu = build_valid_menu(valid_components)
invalid_menu = build_invalid_menu(invalid_components)
return merge_nested_dicts_with_renaming(valid_menu, invalid_menu)
def create_invalid_component_template(component, component_name):
"""Create a template for an invalid component."""
component_code = component["code"]
component_frontend_node = CustomComponentFrontendNode(
description="ERROR - Check your Python Code",
display_name=f"ERROR - {component_name}",
)
component_frontend_node.error = component.get("error", None)
field = component_frontend_node.template.get_field("code")
field.value = component_code
component_frontend_node.template.update_field("code", field)
return component_frontend_node.model_dump(by_alias=True, exclude_none=True)
def log_invalid_component_details(component):
"""Log details of an invalid component."""
logger.debug(component)
logger.debug(f"Component Path: {component.get('path', None)}")
logger.debug(f"Component Error: {component.get('error', None)}")
def build_invalid_component(component):
"""Build a single invalid component."""
component_name = component["name"]
component_template = create_invalid_component_template(component, component_name)
log_invalid_component_details(component)
return component_name, component_template
def build_invalid_menu_items(menu_item):
"""Build invalid menu items for a given menu."""
menu_items = {}
for component in menu_item["components"]:
try:
component_name, component_template = build_invalid_component(component)
menu_items[component_name] = component_template
logger.debug(f"Added {component_name} to invalid menu.")
except Exception as exc:
logger.exception(f"Error while creating custom component [{component.get('name')}]: {str(exc)}")
return menu_items
def get_new_key(dictionary, original_key):
counter = 1
new_key = original_key + " (" + str(counter) + ")"
while new_key in dictionary:
counter += 1
new_key = original_key + " (" + str(counter) + ")"
return new_key
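`get_new_key` probes `" (1)"`, `" (2)"`, ... suffixes until it finds a key that is not yet taken. A minimal sketch:

```python
def get_new_key(dictionary, original_key):
    # Append an incrementing " (n)" suffix until the key is unique.
    counter = 1
    new_key = f"{original_key} ({counter})"
    while new_key in dictionary:
        counter += 1
        new_key = f"{original_key} ({counter})"
    return new_key


print(get_new_key({"code": 1, "code (1)": 2}, "code"))  # code (2)
```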
def determine_component_name(component):
"""Determine the name of the component."""
component_output_types = component["output_types"]
if len(component_output_types) == 1:
return component_output_types[0]
else:
file_name = component.get("file").split(".")[0]
return "".join(word.capitalize() for word in file_name.split("_")) if "_" in file_name else file_name
def build_menu_items(menu_item):
"""Build menu items for a given menu."""
menu_items = {}
for component_name, component_template, component in menu_item["components"]:
try:
menu_items[component_name] = component_template
logger.debug(f"Added {component_name} to valid menu.")
except Exception as exc:
logger.error(f"Error loading Component: {component['output_types']}")
logger.exception(f"Error while building custom component {component['output_types']}: {exc}")
return menu_items

View file

@ -1,40 +1,373 @@
import ast
import contextlib
import re
from types import GenericAlias
from typing import Any
import traceback
import warnings
from typing import Any, Dict, List, Optional, Union
from uuid import UUID
from fastapi import HTTPException
from loguru import logger
from langflow.field_typing.range_spec import RangeSpec
from langflow.interface.custom.code_parser.utils import extract_inner_type
from langflow.interface.custom.custom_component import CustomComponent
from langflow.interface.custom.directory_reader.utils import (
build_custom_component_list_from_path,
determine_component_name,
merge_nested_dicts_with_renaming,
)
from langflow.interface.importing.utils import eval_custom_component_code
from langflow.template.field.base import TemplateField
from langflow.template.frontend_node.custom_components import CustomComponentFrontendNode
from langflow.utils.util import get_base_classes
def add_output_types(frontend_node: CustomComponentFrontendNode, return_types: List[str]):
"""Add output types to the frontend node"""
for return_type in return_types:
if return_type is None:
raise HTTPException(
status_code=400,
detail={
"error": ("Invalid return type. Please check your code and try again."),
"traceback": traceback.format_exc(),
},
)
if hasattr(return_type, "__name__"):
return_type = return_type.__name__
elif hasattr(return_type, "__class__"):
return_type = return_type.__class__.__name__
else:
return_type = str(return_type)
frontend_node.add_output_type(return_type)
def add_base_classes(frontend_node: CustomComponentFrontendNode, return_types: List[str]):
"""Add base classes to the frontend node"""
for return_type_instance in return_types:
if return_type_instance is None:
raise HTTPException(
status_code=400,
detail={
"error": ("Invalid return type. Please check your code and try again."),
"traceback": traceback.format_exc(),
},
)
base_classes = get_base_classes(return_type_instance)
for base_class in base_classes:
frontend_node.add_base_class(base_class)
def extract_type_from_optional(field_type):
    """
    Extract the type from a string formatted as "Optional[<type>]".

    Parameters:
        field_type (str): The string from which to extract the type.

    Returns:
        str: The extracted type, or None if no type was found.
    """
    match = re.search(r"\[(.*?)\]$", field_type)
    return match[1] if match else None
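The lazy quantifier anchored at the end of the string keeps nested brackets intact, so `Optional[Dict[str, int]]` yields the full inner hint. A self-contained sketch:

```python
import re


def extract_type_from_optional(field_type):
    # "Optional[str]" -> "str"; returns None when no bracketed type is present.
    match = re.search(r"\[(.*?)\]$", field_type)
    return match[1] if match else None


print(extract_type_from_optional("Optional[str]"))             # str
print(extract_type_from_optional("Optional[Dict[str, int]]"))  # Dict[str, int]
print(extract_type_from_optional("str"))                       # None
```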
def get_field_properties(extra_field):
    """Get the properties of an extra field"""
    field_name = extra_field["name"]
    field_type = extra_field.get("type", "str")
    field_value = extra_field.get("default", "")
    field_required = "optional" not in field_type.lower()

    if not field_required:
        field_type = extract_type_from_optional(field_type)

    if field_value is not None:
        with contextlib.suppress(Exception):
            field_value = ast.literal_eval(field_value)
    return field_name, field_type, field_value, field_required
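`get_field_properties` unwraps `Optional[...]`, flags the field as not required, and tries `ast.literal_eval` on the default so `"3"` becomes the int `3` while non-literal strings pass through unchanged. A runnable sketch (sample field dicts are illustrative):

```python
import ast
import contextlib
import re


def extract_type_from_optional(field_type):
    match = re.search(r"\[(.*?)\]$", field_type)
    return match[1] if match else None


def get_field_properties(extra_field):
    field_name = extra_field["name"]
    field_type = extra_field.get("type", "str")
    field_value = extra_field.get("default", "")
    field_required = "optional" not in field_type.lower()

    if not field_required:
        field_type = extract_type_from_optional(field_type)

    if field_value is not None:
        # Non-literal defaults (e.g. "hello") raise and are kept as strings.
        with contextlib.suppress(Exception):
            field_value = ast.literal_eval(field_value)
    return field_name, field_type, field_value, field_required


print(get_field_properties({"name": "n", "type": "Optional[int]", "default": "3"}))
# ('n', 'int', 3, False)
```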
def extract_union_types_from_generic_alias(return_type: GenericAlias) -> list:
"""
Extracts the inner type from a type hint that is a Union.
"""
return list(return_type.__args__)
def process_type(field_type: str):
if field_type.startswith("list") or field_type.startswith("List"):
return extract_inner_type(field_type)
return "prompt" if field_type == "Prompt" else field_type
def extract_union_types(return_type: str) -> list[str]:
"""
Extracts the member types from a string Union type hint.
"""
# If the return type is a Union, then we need to parse it
return_type = return_type.replace("Union", "").replace("[", "").replace("]", "")
return_types = return_type.split(",")
return_types = [item.strip() for item in return_types]
return return_types
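`process_type` normalizes a field's declared type for the frontend: list hints collapse to their element type and the `Prompt` sentinel maps to the lowercase `"prompt"` field type. A self-contained sketch:

```python
import re


def extract_inner_type(return_type: str) -> str:
    if match := re.match(r"list\[(.*)\]", return_type, re.IGNORECASE):
        return match[1]
    return return_type


def process_type(field_type: str):
    # Unwrap list hints; map the Prompt sentinel to the frontend's "prompt" type.
    if field_type.startswith("list") or field_type.startswith("List"):
        return extract_inner_type(field_type)
    return "prompt" if field_type == "Prompt" else field_type


print(process_type("List[str]"))  # str
print(process_type("Prompt"))     # prompt
print(process_type("int"))        # int
```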
def add_new_custom_field(
frontend_node: CustomComponentFrontendNode,
field_name: str,
field_type: str,
field_value: Any,
field_required: bool,
field_config: dict,
):
# Check field_config if any of the keys are in it
# if it is, update the value
display_name = field_config.pop("display_name", field_name)
field_type = field_config.pop("field_type", field_type)
field_contains_list = "list" in field_type.lower()
field_type = process_type(field_type)
field_value = field_config.pop("value", field_value)
field_advanced = field_config.pop("advanced", False)
if field_type == "bool" and field_value is None:
field_value = False
# If options is a list, then it's a dropdown
# If options is None, then it's a list of strings
is_list = isinstance(field_config.get("options"), list)
field_config["is_list"] = is_list or field_config.get("is_list", False) or field_contains_list
if "name" in field_config:
warnings.warn("The 'name' key in field_config is used to build the object and can't be changed.")
required = field_config.pop("required", field_required)
placeholder = field_config.pop("placeholder", "")
new_field = TemplateField(
name=field_name,
field_type=field_type,
value=field_value,
show=True,
required=required,
advanced=field_advanced,
placeholder=placeholder,
display_name=display_name,
**sanitize_field_config(field_config),
)
frontend_node.template.upsert_field(field_name, new_field)
if isinstance(frontend_node.custom_fields, dict):
frontend_node.custom_fields[field_name] = None
return frontend_node
def add_extra_fields(frontend_node, field_config, function_args):
"""Add extra fields to the frontend node"""
if not function_args:
return
# sort function_args which is a list of dicts
function_args.sort(key=lambda x: x["name"])
for extra_field in function_args:
if "name" not in extra_field or extra_field["name"] == "self":
continue
field_name, field_type, field_value, field_required = get_field_properties(extra_field)
config = field_config.get(field_name, {})
frontend_node = add_new_custom_field(
frontend_node,
field_name,
field_type,
field_value,
field_required,
config,
)
def get_field_dict(field: Union[TemplateField, dict]):
"""Get the field dictionary from a TemplateField or a dict"""
if isinstance(field, TemplateField):
return field.model_dump(by_alias=True, exclude_none=True)
return field
def run_build_config(custom_component: CustomComponent, user_id: Optional[Union[str, UUID]] = None, update_field=None):
"""Build the field configuration for a custom component"""
try:
if custom_component.code is None:
return {}
elif isinstance(custom_component.code, str):
custom_class = eval_custom_component_code(custom_component.code)
else:
raise ValueError("Invalid code type")
except Exception as exc:
logger.error(f"Error while evaluating custom component code: {str(exc)}")
raise HTTPException(
status_code=400,
detail={
"error": ("Invalid type convertion. Please check your code and try again."),
"traceback": traceback.format_exc(),
},
) from exc
try:
build_config: Dict = custom_class(user_id=user_id).build_config()
for field_name, field in build_config.items():
# Allow user to build TemplateField as well
# as a dict with the same keys as TemplateField
field_dict = get_field_dict(field)
if update_field is not None and field_name != update_field:
continue
try:
update_field_dict(field_dict)
build_config[field_name] = field_dict
except Exception as exc:
logger.error(f"Error while getting build_config: {str(exc)}")
return build_config
except Exception as exc:
logger.error(f"Error while building field config: {str(exc)}")
raise HTTPException(
status_code=400,
detail={
"error": ("Invalid type convertion. Please check your code and try again."),
"traceback": traceback.format_exc(),
},
) from exc
def sanitize_template_config(template_config):
"""Sanitize the template config"""
attributes = {
"display_name",
"description",
"beta",
"documentation",
"output_types",
}
for key in template_config.copy():
if key not in attributes:
template_config.pop(key, None)
return template_config
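`sanitize_template_config` whitelists the handful of attributes the frontend node accepts and drops everything else in place. A minimal sketch:

```python
def sanitize_template_config(template_config):
    """Sanitize the template config"""
    attributes = {
        "display_name",
        "description",
        "beta",
        "documentation",
        "output_types",
    }
    # Iterate over a copy so keys can be popped while looping.
    for key in template_config.copy():
        if key not in attributes:
            template_config.pop(key, None)
    return template_config


print(sanitize_template_config({"display_name": "My Node", "secret": 1}))
# {'display_name': 'My Node'}
```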
def build_frontend_node(template_config):
"""Build a frontend node for a custom component"""
try:
sanitized_template_config = sanitize_template_config(template_config)
return CustomComponentFrontendNode(**sanitized_template_config)
except Exception as exc:
logger.error(f"Error while building base frontend node: {exc}")
raise exc
def add_code_field(frontend_node: CustomComponentFrontendNode, raw_code, field_config):
code_field = TemplateField(
dynamic=True,
required=True,
placeholder="",
multiline=True,
value=raw_code,
password=False,
name="code",
advanced=field_config.pop("advanced", False),
field_type="code",
is_list=False,
)
frontend_node.template.add_field(code_field)
return frontend_node
def build_custom_component_template(
custom_component: CustomComponent,
user_id: Optional[Union[str, UUID]] = None,
update_field: Optional[str] = None,
) -> Optional[Dict[str, Any]]:
"""Build a custom component template for the langchain"""
try:
logger.debug("Building custom component template")
frontend_node = build_frontend_node(custom_component.template_config)
logger.debug("Built base frontend node")
logger.debug("Updated attributes")
field_config = run_build_config(custom_component, user_id=user_id, update_field=update_field)
logger.debug("Built field config")
entrypoint_args = custom_component.get_function_entrypoint_args
add_extra_fields(frontend_node, field_config, entrypoint_args)
logger.debug("Added extra fields")
frontend_node = add_code_field(frontend_node, custom_component.code, field_config.get("code", {}))
logger.debug("Added code field")
add_base_classes(frontend_node, custom_component.get_function_entrypoint_return_type)
add_output_types(frontend_node, custom_component.get_function_entrypoint_return_type)
logger.debug("Added base classes")
return frontend_node.to_dict(add_name=False)
except Exception as exc:
if isinstance(exc, HTTPException):
raise exc
raise HTTPException(
status_code=400,
detail={
"error": ("Invalid type convertion. Please check your code and try again."),
"traceback": traceback.format_exc(),
},
) from exc
def create_component_template(component):
"""Create a template for a component."""
component_code = component["code"]
component_output_types = component["output_types"]
component_extractor = CustomComponent(code=component_code)
component_template = build_custom_component_template(component_extractor)
component_template["output_types"] = component_output_types
return component_template
def build_custom_components(settings_service):
"""Build custom components from the specified paths."""
if not settings_service.settings.COMPONENTS_PATH:
return {}
logger.info(f"Building custom components from {settings_service.settings.COMPONENTS_PATH}")
custom_components_from_file = {}
processed_paths = set()
for path in settings_service.settings.COMPONENTS_PATH:
path_str = str(path)
if path_str in processed_paths:
continue
custom_component_dict = build_custom_component_list_from_path(path_str)
if custom_component_dict:
category = next(iter(custom_component_dict))
logger.info(f"Loading {len(custom_component_dict[category])} component(s) from category {category}")
custom_components_from_file = merge_nested_dicts_with_renaming(
custom_components_from_file, custom_component_dict
)
processed_paths.add(path_str)
return custom_components_from_file
def update_field_dict(field_dict):
"""Update the field dictionary by calling options() or value() if they are callable"""
if "options" in field_dict and callable(field_dict["options"]):
field_dict["options"] = field_dict["options"]()
# Also update the "refresh" key
field_dict["refresh"] = True
if "value" in field_dict and callable(field_dict["value"]):
field_dict["value"] = field_dict["value"](field_dict.get("options", []))
field_dict["refresh"] = True
# Let's check if "range_spec" is a RangeSpec object
if "rangeSpec" in field_dict and isinstance(field_dict["rangeSpec"], RangeSpec):
field_dict["rangeSpec"] = field_dict["rangeSpec"].model_dump()
def sanitize_field_config(field_config: Dict):
# If any of the already existing keys are in field_config, remove them
for key in ["name", "field_type", "value", "required", "placeholder", "display_name", "advanced", "show"]:
field_config.pop(key, None)
return field_config
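`sanitize_field_config` strips the keys that `TemplateField` already receives as explicit keyword arguments, so the remaining dict can be splatted in without duplicate-argument errors. A minimal sketch:

```python
def sanitize_field_config(field_config: dict):
    # Remove keys that are passed to TemplateField explicitly elsewhere.
    for key in ["name", "field_type", "value", "required", "placeholder", "display_name", "advanced", "show"]:
        field_config.pop(key, None)
    return field_config


print(sanitize_field_config({"name": "x", "options": ["a"], "show": True}))
# {'options': ['a']}
```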
def build_component(component):
"""Build a single component."""
logger.debug(f"Building component: {component.get('name'), component.get('output_types')}")
component_name = determine_component_name(component)
component_template = create_component_template(component)
return component_name, component_template

View file

@ -5,6 +5,7 @@ from typing import Any, Dict, List
import orjson
from langchain.agents import ZeroShotAgent
from langchain.schema import BaseOutputParser, Document
from langflow.services.database.models.base import orjson_dumps
@ -16,6 +17,8 @@ def handle_node_type(node_type, class_object, params: Dict):
prompt = instantiate_from_template(class_object, params)
elif node_type == "ChatPromptTemplate":
prompt = class_object.from_messages(**params)
elif hasattr(class_object, "from_template") and params.get("template"):
prompt = class_object.from_template(template=params.pop("template"))
else:
prompt = class_object(**params)
return params, prompt

View file

@ -1,24 +1,10 @@
import ast
import contextlib
import re
import traceback
import warnings
from typing import Any, Dict, List, Optional, Union
from uuid import UUID
from cachetools import LRUCache, cached
from fastapi import HTTPException
from loguru import logger
from langflow.field_typing.range_spec import RangeSpec
from langflow.interface.agents.base import agent_creator
from langflow.interface.chains.base import chain_creator
from langflow.interface.custom.custom_component import CustomComponent
from langflow.interface.custom.directory_reader import DirectoryReader
from langflow.interface.custom.utils import extract_inner_type
from langflow.interface.custom.directory_reader.utils import merge_nested_dicts_with_renaming
from langflow.interface.custom.utils import build_custom_components
from langflow.interface.document_loaders.base import documentloader_creator
from langflow.interface.embeddings.base import embedding_creator
from langflow.interface.importing.utils import eval_custom_component_code
from langflow.interface.llms.base import llm_creator
from langflow.interface.memories.base import memory_creator
from langflow.interface.output_parsers.base import output_parser_creator
@ -30,9 +16,6 @@ from langflow.interface.tools.base import tool_creator
from langflow.interface.utilities.base import utility_creator
from langflow.interface.vector_store.base import vectorstore_creator
from langflow.interface.wrappers.base import wrapper_creator
from langflow.template.field.base import TemplateField
from langflow.template.frontend_node.custom_components import CustomComponentFrontendNode
from langflow.utils.util import get_base_classes
# Used to get the base_classes list
@ -80,523 +63,8 @@ def build_langchain_types_dict(): # sourcery skip: dict-assign-update-to-union
return all_types
def process_type(field_type: str):
if field_type.startswith("list") or field_type.startswith("List"):
return extract_inner_type(field_type)
return "prompt" if field_type == "Prompt" else field_type
# TODO: Move to correct place
def add_new_custom_field(
frontend_node: CustomComponentFrontendNode,
field_name: str,
field_type: str,
field_value: Any,
field_required: bool,
field_config: dict,
):
# Check field_config if any of the keys are in it
# if it is, update the value
display_name = field_config.pop("display_name", field_name)
field_type = field_config.pop("field_type", field_type)
field_contains_list = "list" in field_type.lower()
field_type = process_type(field_type)
field_value = field_config.pop("value", field_value)
field_advanced = field_config.pop("advanced", False)
if field_type == "bool" and field_value is None:
field_value = False
# If options is a list, then it's a dropdown
# If options is None, then it's a list of strings
is_list = isinstance(field_config.get("options"), list)
field_config["is_list"] = is_list or field_config.get("is_list", False) or field_contains_list
if "name" in field_config:
warnings.warn("The 'name' key in field_config is used to build the object and can't be changed.")
required = field_config.pop("required", field_required)
placeholder = field_config.pop("placeholder", "")
new_field = TemplateField(
name=field_name,
field_type=field_type,
value=field_value,
show=True,
required=required,
advanced=field_advanced,
placeholder=placeholder,
display_name=display_name,
**sanitize_field_config(field_config),
)
frontend_node.template.upsert_field(field_name, new_field)
if isinstance(frontend_node.custom_fields, dict):
frontend_node.custom_fields[field_name] = None
return frontend_node
def sanitize_field_config(field_config: Dict):
# If any of the already existing keys are in field_config, remove them
for key in ["name", "field_type", "value", "required", "placeholder", "display_name", "advanced", "show"]:
field_config.pop(key, None)
return field_config
# TODO: Move to correct place
def add_code_field(frontend_node: CustomComponentFrontendNode, raw_code, field_config):
code_field = TemplateField(
dynamic=True,
required=True,
placeholder="",
multiline=True,
value=raw_code,
password=False,
name="code",
advanced=field_config.pop("advanced", False),
field_type="code",
is_list=False,
)
frontend_node.template.add_field(code_field)
return frontend_node
def extract_type_from_optional(field_type):
"""
Extract the type from a string formatted as "Optional[<type>]".
Parameters:
field_type (str): The string from which to extract the type.
Returns:
str: The extracted type, or None if no type was found.
"""
match = re.search(r"\[(.*?)\]$", field_type)
return match[1] if match else None
def build_frontend_node(template_config):
"""Build a frontend node for a custom component"""
try:
sanitized_template_config = sanitize_template_config(template_config)
return CustomComponentFrontendNode(**sanitized_template_config)
except Exception as exc:
logger.error(f"Error while building base frontend node: {exc}")
raise exc
def sanitize_template_config(template_config):
"""Sanitize the template config"""
attributes = {
"display_name",
"description",
"beta",
"documentation",
"output_types",
}
for key in template_config.copy():
if key not in attributes:
template_config.pop(key, None)
return template_config
def build_field_config(
custom_component: CustomComponent, user_id: Optional[Union[str, UUID]] = None, update_field=None
):
"""Build the field configuration for a custom component"""
try:
if custom_component.code is None:
return {}
elif isinstance(custom_component.code, str):
custom_class = eval_custom_component_code(custom_component.code)
else:
raise ValueError("Invalid code type")
except Exception as exc:
logger.error(f"Error while evaluating custom component code: {str(exc)}")
raise HTTPException(
status_code=400,
detail={
"error": ("Invalid type convertion. Please check your code and try again."),
"traceback": traceback.format_exc(),
},
) from exc
try:
build_config: Dict = custom_class(user_id=user_id).build_config()
for field_name, field in build_config.items():
# Allow user to build TemplateField as well
# as a dict with the same keys as TemplateField
field_dict = get_field_dict(field)
if update_field is not None and field_name != update_field:
continue
try:
update_field_dict(field_dict)
build_config[field_name] = field_dict
except Exception as exc:
logger.error(f"Error while getting build_config: {str(exc)}")
return build_config
except Exception as exc:
logger.error(f"Error while building field config: {str(exc)}")
raise HTTPException(
status_code=400,
detail={
"error": ("Invalid type convertion. Please check your code and try again."),
"traceback": traceback.format_exc(),
},
) from exc
def get_field_dict(field):
"""Get the field dictionary from a TemplateField or a dict"""
if isinstance(field, TemplateField):
return field.model_dump(by_alias=True, exclude_none=True)
return field
def update_field_dict(field_dict):
"""Update the field dictionary by calling options() or value() if they are callable"""
if "options" in field_dict and callable(field_dict["options"]):
field_dict["options"] = field_dict["options"]()
# Also update the "refresh" key
field_dict["refresh"] = True
if "value" in field_dict and callable(field_dict["value"]):
field_dict["value"] = field_dict["value"](field_dict.get("options", []))
field_dict["refresh"] = True
# Let's check if "range_spec" is a RangeSpec object
if "rangeSpec" in field_dict and isinstance(field_dict["rangeSpec"], RangeSpec):
field_dict["rangeSpec"] = field_dict["rangeSpec"].model_dump()
def add_extra_fields(frontend_node, field_config, function_args):
"""Add extra fields to the frontend node"""
if not function_args:
return
# sort function_args which is a list of dicts
function_args.sort(key=lambda x: x["name"])
for extra_field in function_args:
if "name" not in extra_field or extra_field["name"] == "self":
continue
field_name, field_type, field_value, field_required = get_field_properties(extra_field)
config = field_config.get(field_name, {})
frontend_node = add_new_custom_field(
frontend_node,
field_name,
field_type,
field_value,
field_required,
config,
)
def get_field_properties(extra_field):
"""Get the properties of an extra field"""
field_name = extra_field["name"]
field_type = extra_field.get("type", "str")
field_value = extra_field.get("default", "")
field_required = "optional" not in field_type.lower()
if not field_required:
field_type = extract_type_from_optional(field_type)
if field_value is not None:
with contextlib.suppress(Exception):
field_value = ast.literal_eval(field_value)
return field_name, field_type, field_value, field_required
def add_base_classes(frontend_node: CustomComponentFrontendNode, return_types: List[str]):
"""Add base classes to the frontend node"""
for return_type_instance in return_types:
if return_type_instance is None:
raise HTTPException(
status_code=400,
detail={
"error": ("Invalid return type. Please check your code and try again."),
"traceback": traceback.format_exc(),
},
)
base_classes = get_base_classes(return_type_instance)
for base_class in base_classes:
frontend_node.add_base_class(base_class)
def add_output_types(frontend_node: CustomComponentFrontendNode, return_types: List[str]):
"""Add output types to the frontend node"""
for return_type in return_types:
if return_type is None:
raise HTTPException(
status_code=400,
detail={
"error": ("Invalid return type. Please check your code and try again."),
"traceback": traceback.format_exc(),
},
)
if hasattr(return_type, "__name__"):
return_type = return_type.__name__
elif hasattr(return_type, "__class__"):
return_type = return_type.__class__.__name__
else:
return_type = str(return_type)
frontend_node.add_output_type(return_type)
def build_custom_component_template(
custom_component: CustomComponent,
user_id: Optional[Union[str, UUID]] = None,
update_field: Optional[str] = None,
) -> Optional[Dict[str, Any]]:
"""Build a custom component template for the langchain"""
try:
logger.debug("Building custom component template")
frontend_node = build_frontend_node(custom_component.template_config)
logger.debug("Built base frontend node")
logger.debug("Updated attributes")
field_config = build_field_config(custom_component, user_id=user_id, update_field=update_field)
logger.debug("Built field config")
entrypoint_args = custom_component.get_function_entrypoint_args
add_extra_fields(frontend_node, field_config, entrypoint_args)
logger.debug("Added extra fields")
frontend_node = add_code_field(frontend_node, custom_component.code, field_config.get("code", {}))
logger.debug("Added code field")
add_base_classes(frontend_node, custom_component.get_function_entrypoint_return_type)
add_output_types(frontend_node, custom_component.get_function_entrypoint_return_type)
logger.debug("Added base classes")
return frontend_node.to_dict(add_name=False)
except Exception as exc:
if isinstance(exc, HTTPException):
raise exc
raise HTTPException(
status_code=400,
detail={
"error": ("Invalid type convertion. Please check your code and try again."),
"traceback": traceback.format_exc(),
},
) from exc
def load_files_from_path(path: str):
"""Load all files from a given path"""
reader = DirectoryReader(path, False)
return reader.get_files()
def build_and_validate_all_files(reader: DirectoryReader, file_list):
"""Build and validate all files"""
data = reader.build_component_menu_list(file_list)
valid_components = reader.filter_loaded_components(data=data, with_errors=False)
invalid_components = reader.filter_loaded_components(data=data, with_errors=True)
return valid_components, invalid_components
def build_valid_menu(valid_components):
"""Build the valid menu."""
valid_menu = {}
logger.debug("------------------- VALID COMPONENTS -------------------")
for menu_item in valid_components["menu"]:
menu_name = menu_item["name"]
valid_menu[menu_name] = build_menu_items(menu_item)
return valid_menu
def build_menu_items(menu_item):
"""Build menu items for a given menu."""
menu_items = {}
for component in menu_item["components"]:
try:
component_name, component_template = build_component(component)
menu_items[component_name] = component_template
logger.debug(f"Added {component_name} to valid menu.")
except Exception as exc:
logger.error(f"Error loading Component: {component['output_types']}")
logger.exception(f"Error while building custom component {component['output_types']}: {exc}")
return menu_items
def build_component(component):
"""Build a single component."""
logger.debug(f"Building component: {component.get('name'), component.get('output_types')}")
component_name = determine_component_name(component)
component_template = create_component_template(component)
return component_name, component_template
def determine_component_name(component):
"""Determine the name of the component."""
component_output_types = component["output_types"]
if len(component_output_types) == 1:
return component_output_types[0]
else:
file_name = component.get("file").split(".")[0]
return "".join(word.capitalize() for word in file_name.split("_")) if "_" in file_name else file_name
def create_component_template(component):
"""Create a template for a component."""
component_code = component["code"]
component_output_types = component["output_types"]
component_extractor = CustomComponent(code=component_code)
component_extractor.validate()
component_template = build_custom_component_template(component_extractor)
component_template["output_types"] = component_output_types
return component_template
def build_invalid_menu(invalid_components):
"""Build the invalid menu."""
if not invalid_components.get("menu"):
return {}
logger.debug("------------------- INVALID COMPONENTS -------------------")
invalid_menu = {}
for menu_item in invalid_components["menu"]:
menu_name = menu_item["name"]
invalid_menu[menu_name] = build_invalid_menu_items(menu_item)
return invalid_menu
def build_invalid_menu_items(menu_item):
"""Build invalid menu items for a given menu."""
menu_items = {}
for component in menu_item["components"]:
try:
component_name, component_template = build_invalid_component(component)
menu_items[component_name] = component_template
logger.debug(f"Added {component_name} to invalid menu.")
except Exception as exc:
logger.exception(f"Error while creating custom component [{component_name}]: {str(exc)}")
return menu_items
def build_invalid_component(component):
"""Build a single invalid component."""
component_name = component["name"]
component_template = create_invalid_component_template(component, component_name)
log_invalid_component_details(component)
return component_name, component_template
def create_invalid_component_template(component, component_name):
"""Create a template for an invalid component."""
component_code = component["code"]
component_template = (
CustomComponentFrontendNode(
description="ERROR - Check your Python Code",
display_name=f"ERROR - {component_name}",
)
.to_dict()
.get(type(CustomComponent()).__name__)
)
component_template["error"] = component.get("error", None)
component_template.get("template").get("code")["value"] = component_code
return component_template
def log_invalid_component_details(component):
"""Log details of an invalid component."""
logger.debug(component)
logger.debug(f"Component Path: {component.get('path', None)}")
logger.debug(f"Component Error: {component.get('error', None)}")
def get_new_key(dictionary, original_key):
counter = 1
new_key = original_key + " (" + str(counter) + ")"
while new_key in dictionary:
counter += 1
new_key = original_key + " (" + str(counter) + ")"
return new_key
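Restated as a self-contained snippet, the renaming helper walks `name (1)`, `name (2)`, … until it finds a key the dictionary does not yet contain:

```python
def get_new_key(dictionary: dict, original_key: str) -> str:
    """Find the first 'original_key (N)' not already present in the dictionary."""
    counter = 1
    new_key = f"{original_key} ({counter})"
    while new_key in dictionary:
        counter += 1
        new_key = f"{original_key} ({counter})"
    return new_key
```

With `"LLMs"` and `"LLMs (1)"` already taken, the next suggestion is `"LLMs (2)"`.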
def merge_nested_dicts_with_renaming(dict1, dict2):
for key, value in dict2.items():
if key in dict1 and isinstance(value, dict) and isinstance(dict1.get(key), dict):
for sub_key, sub_value in value.items():
# if sub_key in dict1[key]:
# new_key = get_new_key(dict1[key], sub_key)
# dict1[key][new_key] = sub_value
# else:
dict1[key][sub_key] = sub_value
else:
dict1[key] = value
return dict1
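Note that with the renaming branch commented out, a colliding sub-key from `dict2` simply overwrites the one in `dict1`, despite the function's name. A self-contained restatement showing that behavior:

```python
def merge_nested_dicts_with_renaming(dict1: dict, dict2: dict) -> dict:
    """One-level nested merge; colliding sub-keys are overwritten (renaming is disabled)."""
    for key, value in dict2.items():
        if key in dict1 and isinstance(value, dict) and isinstance(dict1.get(key), dict):
            for sub_key, sub_value in value.items():
                dict1[key][sub_key] = sub_value
        else:
            dict1[key] = value
    return dict1
```

Merging `{"agents": {"A": 1}}` with `{"agents": {"B": 2}, "tools": {"T": 3}}` folds both sub-dicts together and carries new top-level keys across.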
def build_custom_component_list_from_path(path: str):
"""Build a list of custom components for the langchain from a given path"""
file_list = load_files_from_path(path)
reader = DirectoryReader(path, False)
valid_components, invalid_components = build_and_validate_all_files(reader, file_list)
valid_menu = build_valid_menu(valid_components)
invalid_menu = build_invalid_menu(invalid_components)
return merge_nested_dicts_with_renaming(valid_menu, invalid_menu)
def get_all_types_dict(settings_service):
"""Get all types dictionary combining native and custom components."""
native_components = build_langchain_types_dict()
custom_components_from_file = build_custom_components(settings_service)
return merge_nested_dicts_with_renaming(native_components, custom_components_from_file)
def build_custom_components(settings_service):
"""Build custom components from the specified paths."""
if not settings_service.settings.COMPONENTS_PATH:
return {}
logger.info(f"Building custom components from {settings_service.settings.COMPONENTS_PATH}")
custom_components_from_file = {}
processed_paths = set()
for path in settings_service.settings.COMPONENTS_PATH:
path_str = str(path)
if path_str in processed_paths:
continue
custom_component_dict = build_custom_component_list_from_path(path_str)
if custom_component_dict:
category = next(iter(custom_component_dict))
logger.info(f"Loading {len(custom_component_dict[category])} component(s) from category {category}")
custom_components_from_file = merge_nested_dicts_with_renaming(
custom_components_from_file, custom_component_dict
)
processed_paths.add(path_str)
return custom_components_from_file
def merge_nested_dicts(dict1, dict2):
for key, value in dict2.items():
if isinstance(value, dict) and isinstance(dict1.get(key), dict):
dict1[key] = merge_nested_dicts(dict1[key], value)
else:
dict1[key] = value
return dict1
def create_and_validate_component(code: str) -> CustomComponent:
component = CustomComponent(code=code)
component.validate()
return component

View file

@ -0,0 +1,52 @@
import asyncio
import json
from pathlib import Path
from typing import Optional, Union
from langflow.graph import Graph
from langflow.processing.process import fix_memory_inputs, process_tweaks
def load_flow_from_json(flow: Union[Path, str, dict], tweaks: Optional[dict] = None, build=True):
"""
Load flow from a JSON file or a JSON object.
:param flow: JSON file path or JSON object
:param tweaks: Optional tweaks to be processed
:param build: If True, build the graph, otherwise return the graph object
:return: Langchain object or Graph object depending on the build parameter
"""
# If input is a file path, load JSON from the file
if isinstance(flow, (str, Path)):
with open(flow, "r", encoding="utf-8") as f:
flow_graph = json.load(f)
# If input is a dictionary, assume it's a JSON object
elif isinstance(flow, dict):
flow_graph = flow
else:
raise TypeError("Input must be either a file path (str) or a JSON object (dict)")
graph_data = flow_graph["data"]
if tweaks is not None:
graph_data = process_tweaks(graph_data, tweaks)
nodes = graph_data["nodes"]
edges = graph_data["edges"]
graph = Graph(nodes, edges)
if build:
langchain_object = asyncio.run(graph.build())
if hasattr(langchain_object, "verbose"):
langchain_object.verbose = True
if hasattr(langchain_object, "return_intermediate_steps"):
# Deactivating until we have a frontend solution
# to display intermediate steps
langchain_object.return_intermediate_steps = False
fix_memory_inputs(langchain_object)
return langchain_object
return graph
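`load_flow_from_json` accepts either a path to an exported flow JSON or an already-parsed dict; anything else raises `TypeError`. A minimal sketch of just that input dispatch, restated so it runs standalone:

```python
import json
from pathlib import Path

def resolve_flow(flow):
    """Mirror the input dispatch of load_flow_from_json: path -> parsed JSON, dict -> as-is."""
    if isinstance(flow, (str, Path)):
        with open(flow, "r", encoding="utf-8") as f:
            return json.load(f)
    if isinstance(flow, dict):
        return flow
    raise TypeError("Input must be either a file path (str) or a JSON object (dict)")
```

The resolved object is then expected to carry the graph under a top-level `"data"` key with `"nodes"` and `"edges"` lists.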


@ -1,13 +1,13 @@
import asyncio
import json
from pathlib import Path
from typing import Any, Coroutine, Dict, List, Optional, Tuple, Union
from langchain.agents import AgentExecutor
from langchain.chains.base import Chain
from langchain.schema import AgentAction, Document
from langchain.vectorstores.base import VectorStore
from langflow.graph import Graph
from langchain_core.messages import AIMessage
from langchain_core.runnables.base import Runnable
from langflow.interface.custom.custom_component import CustomComponent
from langflow.interface.run import build_sorted_vertices, get_memory_key, update_memory_keys
from langflow.services.deps import get_session_service
from loguru import logger
@ -106,10 +106,23 @@ def get_build_result(data_graph, session_id):
return build_sorted_vertices(data_graph)
def process_inputs(inputs: Optional[dict], artifacts: Dict[str, Any]) -> dict:
def process_inputs(
inputs: Optional[Union[dict, List[dict]]] = None, artifacts: Optional[Dict[str, Any]] = None
) -> Union[dict, List[dict]]:
if inputs is None:
inputs = {}
if artifacts is None:
artifacts = {}
if isinstance(inputs, dict):
inputs = update_inputs_dict(inputs, artifacts)
elif isinstance(inputs, List):
inputs = [update_inputs_dict(inp, artifacts) for inp in inputs]
return inputs
def update_inputs_dict(inputs: dict, artifacts: Dict[str, Any]) -> dict:
for key, value in artifacts.items():
if key == "repr":
continue
@ -119,23 +132,69 @@ def process_inputs(inputs: Optional[dict], artifacts: Dict[str, Any]) -> dict:
return inputs
def generate_result(langchain_object: Union[Chain, VectorStore], inputs: dict):
if isinstance(langchain_object, Chain):
async def process_runnable(runnable: Runnable, inputs: Union[dict, List[dict]]):
if isinstance(inputs, List) and hasattr(runnable, "abatch"):
result = await runnable.abatch(inputs)
elif isinstance(inputs, dict) and hasattr(runnable, "ainvoke"):
result = await runnable.ainvoke(inputs)
else:
raise ValueError(f"Runnable {runnable} does not support inputs of type {type(inputs)}")
# Check if the result is a list of AIMessages
if isinstance(result, list) and all(isinstance(r, AIMessage) for r in result):
result = [r.content for r in result]
elif isinstance(result, AIMessage):
result = result.content
return result
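The dispatch rule above (a list of inputs goes to `abatch`, a single dict to `ainvoke`) can be exercised with a toy stand-in for a Runnable; the `EchoRunnable` class here is hypothetical, not part of LangChain:

```python
import asyncio

class EchoRunnable:
    """Toy stand-in for a LangChain Runnable exposing ainvoke/abatch."""
    async def ainvoke(self, inputs: dict):
        return inputs["text"].upper()

    async def abatch(self, inputs: list):
        return [await self.ainvoke(i) for i in inputs]

async def dispatch_runnable(runnable, inputs):
    # Same rule as process_runnable: list -> abatch, dict -> ainvoke
    if isinstance(inputs, list) and hasattr(runnable, "abatch"):
        return await runnable.abatch(inputs)
    if isinstance(inputs, dict) and hasattr(runnable, "ainvoke"):
        return await runnable.ainvoke(inputs)
    raise ValueError(f"Runnable {runnable} does not support inputs of type {type(inputs)}")
```

A single dict returns one result, a list of dicts returns a list of results.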
async def process_inputs_dict(built_object: Union[Chain, VectorStore, Runnable], inputs: dict):
if isinstance(built_object, Chain):
if inputs is None:
raise ValueError("Inputs must be provided for a Chain")
logger.debug("Generating result and thought")
result = get_result_and_thought(langchain_object, inputs)
result = get_result_and_thought(built_object, inputs)
logger.debug("Generated result and thought")
elif isinstance(langchain_object, VectorStore):
result = langchain_object.search(**inputs)
elif isinstance(langchain_object, Document):
result = langchain_object.dict()
elif isinstance(built_object, VectorStore) and "query" in inputs:
if isinstance(inputs, dict) and "search_type" not in inputs:
inputs["search_type"] = "similarity"
logger.info("search_type not provided, using default value: similarity")
result = built_object.search(**inputs)
elif isinstance(built_object, Document):
result = built_object.dict()
elif isinstance(built_object, Runnable):
result = await process_runnable(built_object, inputs)
if isinstance(result, list):
result = [r.content if hasattr(r, "content") else r for r in result]
elif hasattr(result, "content"):
result = result.content
else:
result = result
elif hasattr(built_object, "run") and isinstance(built_object, CustomComponent):
result = built_object.run(inputs)
else:
logger.warning(f"Unknown langchain_object type: {type(langchain_object)}")
if isinstance(langchain_object, Coroutine):
result = asyncio.run(langchain_object)
result = langchain_object
result = None
return result
async def process_inputs_list(built_object: Runnable, inputs: List[dict]):
return await process_runnable(built_object, inputs)
async def generate_result(built_object: Union[Chain, VectorStore, Runnable], inputs: Union[dict, List[dict]]):
if isinstance(inputs, dict):
result = await process_inputs_dict(built_object, inputs)
elif isinstance(inputs, List) and isinstance(built_object, Runnable):
result = await process_inputs_list(built_object, inputs)
else:
raise ValueError(f"Invalid inputs type: {type(inputs)}")
if result is None:
logger.warning(f"Unknown built_object type: {type(built_object)}")
if isinstance(built_object, Coroutine):
result = asyncio.run(built_object)
result = built_object
return result
@ -147,7 +206,7 @@ class Result(BaseModel):
async def process_graph_cached(
data_graph: Dict[str, Any],
inputs: Optional[dict] = None,
inputs: Optional[Union[dict, List[dict]]] = None,
clear_cache=False,
session_id=None,
) -> Result:
@ -163,7 +222,7 @@ async def process_graph_cached(
raise ValueError("Graph not found in the session")
built_object = await graph.build()
processed_inputs = process_inputs(inputs, artifacts or {})
result = generate_result(built_object, processed_inputs)
result = await generate_result(built_object, processed_inputs)
# langchain_object is now updated with the new memory
# we need to update the cache with the updated langchain_object
session_service.update_session(session_id, (graph, artifacts))
@ -171,49 +230,6 @@ async def process_graph_cached(
return Result(result=result, session_id=session_id)
def load_flow_from_json(flow: Union[Path, str, dict], tweaks: Optional[dict] = None, build=True):
"""
Load flow from a JSON file or a JSON object.
:param flow: JSON file path or JSON object
:param tweaks: Optional tweaks to be processed
:param build: If True, build the graph, otherwise return the graph object
:return: Langchain object or Graph object depending on the build parameter
"""
# If input is a file path, load JSON from the file
if isinstance(flow, (str, Path)):
with open(flow, "r", encoding="utf-8") as f:
flow_graph = json.load(f)
# If input is a dictionary, assume it's a JSON object
elif isinstance(flow, dict):
flow_graph = flow
else:
raise TypeError("Input must be either a file path (str) or a JSON object (dict)")
graph_data = flow_graph["data"]
if tweaks is not None:
graph_data = process_tweaks(graph_data, tweaks)
nodes = graph_data["nodes"]
edges = graph_data["edges"]
graph = Graph(nodes, edges)
if build:
langchain_object = asyncio.run(graph.build())
if hasattr(langchain_object, "verbose"):
langchain_object.verbose = True
if hasattr(langchain_object, "return_intermediate_steps"):
# Deactivating until we have a frontend solution
# to display intermediate steps
langchain_object.return_intermediate_steps = False
fix_memory_inputs(langchain_object)
return langchain_object
return graph
def validate_input(graph_data: Dict[str, Any], tweaks: Dict[str, Dict[str, Any]]) -> List[Dict[str, Any]]:
if not isinstance(graph_data, dict) or not isinstance(tweaks, dict):
raise ValueError("graph_data and tweaks should be dictionaries")
@ -263,3 +279,5 @@ def process_tweaks(graph_data: Dict[str, Any], tweaks: Dict[str, Dict[str, Any]]
logger.warning("Each node should be a dictionary with an 'id' key of type str")
return graph_data


@ -22,6 +22,7 @@ class CacheServiceFactory(ServiceFactory):
host=settings_service.settings.REDIS_HOST,
port=settings_service.settings.REDIS_PORT,
db=settings_service.settings.REDIS_DB,
url=settings_service.settings.REDIS_URL,
expiration_time=settings_service.settings.REDIS_CACHE_EXPIRE,
)
if redis_cache.is_connected():


@ -1,14 +1,13 @@
import pickle
import threading
import time
from collections import OrderedDict
from langflow.services.base import Service
from langflow.services.cache.base import BaseCacheService
import pickle
from loguru import logger
from langflow.services.base import Service
from langflow.services.cache.base import BaseCacheService
class InMemoryCache(BaseCacheService, Service):
@ -204,7 +203,7 @@ class RedisCache(BaseCacheService, Service):
b = cache["b"]
"""
def __init__(self, host="localhost", port=6379, db=0, expiration_time=60 * 60):
def __init__(self, host="localhost", port=6379, db=0, url=None, expiration_time=60 * 60):
"""
Initialize a new RedisCache instance.
@ -226,7 +225,10 @@ class RedisCache(BaseCacheService, Service):
"RedisCache is an experimental feature and may not work as expected."
" Please report any issues to our GitHub repository."
)
self._client = redis.StrictRedis(host=host, port=port, db=db)
if url:
self._client = redis.StrictRedis.from_url(url)
else:
self._client = redis.StrictRedis(host=host, port=port, db=db)
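The new `url` branch delegates parsing to `redis.StrictRedis.from_url`. As a rough illustration of what a `redis://host:port/db` connection string encodes, using only the standard library (the URL below is an example value, not a real deployment):

```python
from urllib.parse import urlparse

# Example connection string of the form StrictRedis.from_url accepts
url = urlparse("redis://localhost:6379/0")
host, port, db = url.hostname, url.port, int(url.path.lstrip("/") or 0)
```

The single `REDIS_URL` setting thus replaces the separate `REDIS_HOST`, `REDIS_PORT`, and `REDIS_DB` fields when it is provided.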
self.expiration_time = expiration_time
# check connection


@ -47,6 +47,7 @@ class Settings(BaseSettings):
REDIS_HOST: str = "localhost"
REDIS_PORT: int = 6379
REDIS_DB: int = 0
REDIS_URL: Optional[str] = None
REDIS_CACHE_EXPIRE: int = 3600
# PLUGIN_DIR: Optional[str] = None


@ -1,8 +1,11 @@
import traceback
from typing import Any, Callable, Optional, Tuple
import anyio
from langflow.services.task.backends.base import TaskBackend
from loguru import logger
from langflow.services.task.backends.base import TaskBackend
class AnyIOTaskResult:
def __init__(self, scope):
@ -17,6 +20,12 @@ class AnyIOTaskResult:
return "FAILURE" if self._exception is not None else "SUCCESS"
return self._status
@property
def traceback(self) -> str:
if self._traceback is not None:
return "".join(traceback.format_tb(self._traceback))
return ""
@property
def result(self) -> Any:
return self._result
@ -29,6 +38,7 @@ class AnyIOTaskResult:
self._result = await func(*args, **kwargs)
except Exception as e:
self._exception = e
self._traceback = e.__traceback__
finally:
self._status = "DONE"


@ -47,7 +47,6 @@ class Settings(BaseSettings):
else:
logger.debug("No DATABASE_URL env variable, using sqlite database")
value = "sqlite:///./langflow.db"
return value
@validator("COMPONENTS_PATH", pre=True)


@ -146,57 +146,6 @@ class CSVAgentNode(FrontendNode):
base_classes: list[str] = ["AgentExecutor"]
class InitializeAgentNode(FrontendNode):
name: str = "AgentInitializer"
display_name: str = "AgentInitializer"
template: Template = Template(
type_name="initialize_agent",
fields=[
TemplateField(
field_type="str", # pyright: ignore
required=True,
is_list=True, # pyright: ignore
show=True,
multiline=False,
options=list(NON_CHAT_AGENTS.keys()),
value=list(NON_CHAT_AGENTS.keys())[0],
name="agent",
advanced=False,
),
TemplateField(
field_type="BaseChatMemory", # pyright: ignore
required=False,
show=True,
name="memory",
advanced=False,
),
TemplateField(
field_type="Tool", # pyright: ignore
required=True,
show=True,
name="tools",
is_list=True, # pyright: ignore
advanced=False,
),
TemplateField(
field_type="BaseLanguageModel", # pyright: ignore
required=True,
show=True,
name="llm",
display_name="LLM",
advanced=False,
),
],
)
description: str = """Construct a zero shot agent from an LLM and tools."""
base_classes: list[str] = ["AgentExecutor", "Callable"]
@staticmethod
def format_field(field: TemplateField, name: Optional[str] = None) -> None:
# do nothing and don't return anything
pass
class JsonAgentNode(FrontendNode):
name: str = "JsonAgent"
template: Template = Template(


@ -1,14 +1,15 @@
from typing import Optional
from langchain_community.chat_message_histories.mongodb import (
DEFAULT_COLLECTION_NAME,
DEFAULT_DBNAME,
)
from langchain_community.chat_message_histories.postgres import DEFAULT_CONNECTION_STRING
from langflow.template.field.base import TemplateField
from langflow.template.frontend_node.base import FrontendNode
from langflow.template.frontend_node.constants import INPUT_KEY_INFO, OUTPUT_KEY_INFO
from langflow.template.template.base import Template
from langchain.memory.chat_message_histories.postgres import DEFAULT_CONNECTION_STRING
from langchain.memory.chat_message_histories.mongodb import (
DEFAULT_COLLECTION_NAME,
DEFAULT_DBNAME,
)
class MemoryFrontendNode(FrontendNode):


@ -2,7 +2,7 @@ import ast
import contextlib
import importlib
from types import FunctionType
from typing import Dict
from typing import Dict, List, Optional, Union
from langflow.field_typing.constants import CUSTOM_COMPONENT_SUPPORTED_TYPES
@ -260,14 +260,13 @@ def get_default_imports(code_string):
"""
Returns a dictionary of default imports for the dynamic class constructor.
"""
typing_module = importlib.import_module("typing")
default_imports = {
"Optional": typing_module.Optional,
"List": typing_module.List,
"Dict": typing_module.Dict,
"Union": typing_module.Union,
}
default_imports = {
"Optional": Optional,
"List": List,
"Dict": Dict,
"Union": Union,
}
langflow_imports = list(CUSTOM_COMPONENT_SUPPORTED_TYPES.keys())
necessary_imports = find_names_in_code(code_string, langflow_imports)
langflow_module = importlib.import_module("langflow.field_typing")


@ -1,4 +1,4 @@
from typing import TYPE_CHECKING, Any, Dict, Optional
from typing import TYPE_CHECKING, Any, Dict, List, Optional, Union
from asgiref.sync import async_to_sync
from celery.exceptions import SoftTimeLimitExceeded # type: ignore
@ -34,7 +34,7 @@ def build_vertex(self, vertex: "Vertex") -> "Vertex":
@celery_app.task(acks_late=True)
def process_graph_cached_task(
data_graph: Dict[str, Any],
inputs: Optional[dict] = None,
inputs: Optional[Union[dict, List[dict]]] = None,
clear_cache=False,
session_id=None,
) -> Dict[str, Any]:
@ -62,7 +62,7 @@ def process_graph_cached_task(
logger.debug(f"Built object: {built_object}")
processed_inputs = process_inputs(inputs, artifacts or {})
result = generate_result(built_object, processed_inputs)
result = async_to_sync(generate_result, force_new_loop=True)(built_object, processed_inputs)
# Update the session with the new data
session_service.update_session(session_id, (graph, artifacts))


@ -0,0 +1,26 @@
# baseline
FROM --platform=linux/amd64 node:19-bullseye-slim AS base
RUN mkdir -p /home/node/app
RUN chown -R node:node /home/node && chmod -R 770 /home/node
RUN apt-get update && apt-get install -y jq curl
WORKDIR /home/node/app
# client build
FROM base AS builder-client
ARG BACKEND_URL
ENV BACKEND_URL $BACKEND_URL
RUN echo "BACKEND_URL: $BACKEND_URL"
WORKDIR /home/node/app
COPY --chown=node:node . ./
COPY ./set_proxy.sh .
RUN chmod +x set_proxy.sh && \
cat set_proxy.sh | tr -d '\r' > set_proxy_unix.sh && \
chmod +x set_proxy_unix.sh && \
./set_proxy_unix.sh
USER node
RUN npm install --loglevel warn
CMD ["npm", "run", "dev:docker"]


@ -94,6 +94,7 @@
"prettier": "^2.8.8",
"prettier-plugin-organize-imports": "^3.2.3",
"prettier-plugin-tailwindcss": "^0.3.0",
"pretty-quick": "^3.1.3",
"tailwindcss": "^3.3.3",
"typescript": "^5.2.2",
"vite": "^4.5.1"
@ -3774,6 +3775,12 @@
"@types/unist": "^2"
}
},
"node_modules/@types/minimatch": {
"version": "3.0.5",
"resolved": "https://registry.npmjs.org/@types/minimatch/-/minimatch-3.0.5.tgz",
"integrity": "sha512-Klz949h02Gz2uZCMGwDUSDS1YBlTdDDgbWHi+81l29tQALUtvz4rAYi5uoVhE5Lagoq6DeqAUlbrHvW/mXDgdQ==",
"dev": true
},
"node_modules/@types/ms": {
"version": "0.7.34",
"resolved": "https://registry.npmjs.org/@types/ms/-/ms-0.7.34.tgz",
@ -4058,6 +4065,33 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/array-differ": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/array-differ/-/array-differ-3.0.0.tgz",
"integrity": "sha512-THtfYS6KtME/yIAhKjZ2ul7XI96lQGHRputJQHO80LAWQnuGP4iCIN8vdMRboGbIEYBwU33q8Tch1os2+X0kMg==",
"dev": true,
"engines": {
"node": ">=8"
}
},
"node_modules/array-union": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/array-union/-/array-union-2.1.0.tgz",
"integrity": "sha512-HGyxoOTYUyCM6stUe6EJgnd4EoewAI7zMdfqO+kGjnlZmBDz/cR5pf8r/cR4Wq60sL/p0IkcjUEEPwS3GFrIyw==",
"dev": true,
"engines": {
"node": ">=8"
}
},
"node_modules/arrify": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/arrify/-/arrify-2.0.1.tgz",
"integrity": "sha512-3duEwti880xqi4eAMN8AyR4a0ByT90zoYdLlevfrvU43vb0YZwZVfxOgxWrLXXXpyugL0hNZc9G6BiB5B3nUug==",
"dev": true,
"engines": {
"node": ">=8"
}
},
"node_modules/astral-regex": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/astral-regex/-/astral-regex-2.0.0.tgz",
@ -5611,6 +5645,19 @@
"resolved": "https://registry.npmjs.org/find-root/-/find-root-1.1.0.tgz",
"integrity": "sha512-NKfW6bec6GfKc0SGx1e07QZY9PE99u0Bft/0rzSD5k3sO/vwkVUpDUKVm5Gpp5Ue3YfShPFTX2070tDs5kB9Ng=="
},
"node_modules/find-up": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/find-up/-/find-up-4.1.0.tgz",
"integrity": "sha512-PpOwAdQ/YlXQ2vj8a3h8IipDuYRi3wceVQQGYWxNINccq40Anw7BlsEXCMbt1Zt+OLA6Fq9suIpIWD0OsnISlw==",
"dev": true,
"dependencies": {
"locate-path": "^5.0.0",
"path-exists": "^4.0.0"
},
"engines": {
"node": ">=8"
}
},
"node_modules/find-versions": {
"version": "5.1.0",
"resolved": "https://registry.npmjs.org/find-versions/-/find-versions-5.1.0.tgz",
@ -6195,6 +6242,15 @@
}
]
},
"node_modules/ignore": {
"version": "5.3.0",
"resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.0.tgz",
"integrity": "sha512-g7dmpshy+gD7mh88OC9NwSGTKoc3kyLAZQRU1mt53Aw/vnvfXnbC+F/7F7QoYVKbV+KNvJx8wArewKy1vXMtlg==",
"dev": true,
"engines": {
"node": ">= 4"
}
},
"node_modules/import-fresh": {
"version": "3.3.0",
"resolved": "https://registry.npmjs.org/import-fresh/-/import-fresh-3.3.0.tgz",
@ -6862,6 +6918,18 @@
"resolved": "https://registry.npmjs.org/lines-and-columns/-/lines-and-columns-1.2.4.tgz",
"integrity": "sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg=="
},
"node_modules/locate-path": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/locate-path/-/locate-path-5.0.0.tgz",
"integrity": "sha512-t7hw9pI+WvuwNJXwk5zVHpyhIqzg2qTlklJOf0mVxGSbe3Fp2VieZcduNYjaLDoy6p9uGpQEGWG87WpMKlNq8g==",
"dev": true,
"dependencies": {
"p-locate": "^4.1.0"
},
"engines": {
"node": ">=8"
}
},
"node_modules/lodash": {
"version": "4.17.21",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
@ -7946,6 +8014,22 @@
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.2.tgz",
"integrity": "sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w=="
},
"node_modules/multimatch": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/multimatch/-/multimatch-4.0.0.tgz",
"integrity": "sha512-lDmx79y1z6i7RNx0ZGCPq1bzJ6ZoDDKbvh7jxr9SJcWLkShMzXrHbYVpTdnhNM5MXpDUxCQ4DgqVttVXlBgiBQ==",
"dev": true,
"dependencies": {
"@types/minimatch": "^3.0.3",
"array-differ": "^3.0.0",
"array-union": "^2.1.0",
"arrify": "^2.0.1",
"minimatch": "^3.0.4"
},
"engines": {
"node": ">=8"
}
},
"node_modules/mz": {
"version": "2.7.0",
"resolved": "https://registry.npmjs.org/mz/-/mz-2.7.0.tgz",
@ -8221,6 +8305,42 @@
"node": ">=4"
}
},
"node_modules/p-limit": {
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/p-limit/-/p-limit-2.3.0.tgz",
"integrity": "sha512-//88mFWSJx8lxCzwdAABTJL2MyWB12+eIY7MDL2SqLmAkeKU9qxRvWuSyTjm3FUmpBEMuFfckAIqEaVGUDxb6w==",
"dev": true,
"dependencies": {
"p-try": "^2.0.0"
},
"engines": {
"node": ">=6"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/p-locate": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/p-locate/-/p-locate-4.1.0.tgz",
"integrity": "sha512-R79ZZ/0wAxKGu3oYMlz8jy/kbhsNrS7SKZ7PxEHBgJ5+F2mtFW2fK2cOtBh1cHYkQsbzFV7I+EoRKe6Yt0oK7A==",
"dev": true,
"dependencies": {
"p-limit": "^2.2.0"
},
"engines": {
"node": ">=8"
}
},
"node_modules/p-try": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/p-try/-/p-try-2.2.0.tgz",
"integrity": "sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ==",
"dev": true,
"engines": {
"node": ">=6"
}
},
"node_modules/parent-module": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/parent-module/-/parent-module-1.0.1.tgz",
@ -8293,6 +8413,15 @@
"resolved": "https://registry.npmjs.org/path-browserify/-/path-browserify-1.0.1.tgz",
"integrity": "sha512-b7uo2UCUOYZcnF/3ID0lulOJi/bafxa1xPe7ZPsammBSpjSWQkjNxlt635YGS2MiR9GjvuXCtz2emr3jbsz98g=="
},
"node_modules/path-exists": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz",
"integrity": "sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w==",
"dev": true,
"engines": {
"node": ">=8"
}
},
"node_modules/path-is-absolute": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/path-is-absolute/-/path-is-absolute-1.0.1.tgz",
@ -8684,6 +8813,172 @@
"integrity": "sha512-w2GsyukL62IJnlaff/nRegPQR94C/XXamvMWmSHRJ4y7Ts/4ocGRmTHvOs8PSE6pB3dWOrD/nueuU5sduBsQ4w==",
"dev": true
},
"node_modules/pretty-quick": {
"version": "3.1.3",
"resolved": "https://registry.npmjs.org/pretty-quick/-/pretty-quick-3.1.3.tgz",
"integrity": "sha512-kOCi2FJabvuh1as9enxYmrnBC6tVMoVOenMaBqRfsvBHB0cbpYHjdQEpSglpASDFEXVwplpcGR4CLEaisYAFcA==",
"dev": true,
"dependencies": {
"chalk": "^3.0.0",
"execa": "^4.0.0",
"find-up": "^4.1.0",
"ignore": "^5.1.4",
"mri": "^1.1.5",
"multimatch": "^4.0.0"
},
"bin": {
"pretty-quick": "bin/pretty-quick.js"
},
"engines": {
"node": ">=10.13"
},
"peerDependencies": {
"prettier": ">=2.0.0"
}
},
"node_modules/pretty-quick/node_modules/chalk": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-3.0.0.tgz",
"integrity": "sha512-4D3B6Wf41KOYRFdszmDqMCGq5VV/uMAB273JILmO+3jAlh8X4qDtdtgCR3fxtbLEMzSx22QdhnDcJvu2u1fVwg==",
"dev": true,
"dependencies": {
"ansi-styles": "^4.1.0",
"supports-color": "^7.1.0"
},
"engines": {
"node": ">=8"
}
},
"node_modules/pretty-quick/node_modules/cross-spawn": {
"version": "7.0.3",
"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.3.tgz",
"integrity": "sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==",
"dev": true,
"dependencies": {
"path-key": "^3.1.0",
"shebang-command": "^2.0.0",
"which": "^2.0.1"
},
"engines": {
"node": ">= 8"
}
},
"node_modules/pretty-quick/node_modules/execa": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/execa/-/execa-4.1.0.tgz",
"integrity": "sha512-j5W0//W7f8UxAn8hXVnwG8tLwdiUy4FJLcSupCg6maBYZDpyBvTApK7KyuI4bKj8KOh1r2YH+6ucuYtJv1bTZA==",
"dev": true,
"dependencies": {
"cross-spawn": "^7.0.0",
"get-stream": "^5.0.0",
"human-signals": "^1.1.1",
"is-stream": "^2.0.0",
"merge-stream": "^2.0.0",
"npm-run-path": "^4.0.0",
"onetime": "^5.1.0",
"signal-exit": "^3.0.2",
"strip-final-newline": "^2.0.0"
},
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sindresorhus/execa?sponsor=1"
}
},
"node_modules/pretty-quick/node_modules/get-stream": {
"version": "5.2.0",
"resolved": "https://registry.npmjs.org/get-stream/-/get-stream-5.2.0.tgz",
"integrity": "sha512-nBF+F1rAZVCu/p7rjzgA+Yb4lfYXrpl7a6VmJrU8wF9I1CKvP/QwPNZHnOlwbTkY6dvtFIzFMSyQXbLoTQPRpA==",
"dev": true,
"dependencies": {
"pump": "^3.0.0"
},
"engines": {
"node": ">=8"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/pretty-quick/node_modules/human-signals": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/human-signals/-/human-signals-1.1.1.tgz",
"integrity": "sha512-SEQu7vl8KjNL2eoGBLF3+wAjpsNfA9XMlXAYj/3EdaNfAlxKthD1xjEQfGOUhllCGGJVNY34bRr6lPINhNjyZw==",
"dev": true,
"engines": {
"node": ">=8.12.0"
}
},
"node_modules/pretty-quick/node_modules/is-stream": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/is-stream/-/is-stream-2.0.1.tgz",
"integrity": "sha512-hFoiJiTl63nn+kstHGBtewWSKnQLpyb155KHheA1l39uvtO9nWIop1p3udqPcUd/xbF1VLMO4n7OI6p7RbngDg==",
"dev": true,
"engines": {
"node": ">=8"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/pretty-quick/node_modules/npm-run-path": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/npm-run-path/-/npm-run-path-4.0.1.tgz",
"integrity": "sha512-S48WzZW777zhNIrn7gxOlISNAqi9ZC/uQFnRdbeIHhZhCA6UqpkOT8T1G7BvfdgP4Er8gF4sUbaS0i7QvIfCWw==",
"dev": true,
"dependencies": {
"path-key": "^3.0.0"
},
"engines": {
"node": ">=8"
}
},
"node_modules/pretty-quick/node_modules/path-key": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/path-key/-/path-key-3.1.1.tgz",
"integrity": "sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q==",
"dev": true,
"engines": {
"node": ">=8"
}
},
"node_modules/pretty-quick/node_modules/shebang-command": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/shebang-command/-/shebang-command-2.0.0.tgz",
"integrity": "sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==",
"dev": true,
"dependencies": {
"shebang-regex": "^3.0.0"
},
"engines": {
"node": ">=8"
}
},
"node_modules/pretty-quick/node_modules/shebang-regex": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/shebang-regex/-/shebang-regex-3.0.0.tgz",
"integrity": "sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A==",
"dev": true,
"engines": {
"node": ">=8"
}
},
"node_modules/pretty-quick/node_modules/which": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz",
"integrity": "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==",
"dev": true,
"dependencies": {
"isexe": "^2.0.0"
},
"bin": {
"node-which": "bin/node-which"
},
"engines": {
"node": ">= 8"
}
},
"node_modules/prismjs": {
"version": "1.29.0",
"resolved": "https://registry.npmjs.org/prismjs/-/prismjs-1.29.0.tgz",
@ -13738,6 +14033,12 @@
"@types/unist": "^2"
}
},
"@types/minimatch": {
"version": "3.0.5",
"resolved": "https://registry.npmjs.org/@types/minimatch/-/minimatch-3.0.5.tgz",
"integrity": "sha512-Klz949h02Gz2uZCMGwDUSDS1YBlTdDDgbWHi+81l29tQALUtvz4rAYi5uoVhE5Lagoq6DeqAUlbrHvW/mXDgdQ==",
"dev": true
},
"@types/ms": {
"version": "0.7.34",
"resolved": "https://registry.npmjs.org/@types/ms/-/ms-0.7.34.tgz",
@ -13965,6 +14266,24 @@
"is-array-buffer": "^3.0.1"
}
},
"array-differ": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/array-differ/-/array-differ-3.0.0.tgz",
"integrity": "sha512-THtfYS6KtME/yIAhKjZ2ul7XI96lQGHRputJQHO80LAWQnuGP4iCIN8vdMRboGbIEYBwU33q8Tch1os2+X0kMg==",
"dev": true
},
"array-union": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/array-union/-/array-union-2.1.0.tgz",
"integrity": "sha512-HGyxoOTYUyCM6stUe6EJgnd4EoewAI7zMdfqO+kGjnlZmBDz/cR5pf8r/cR4Wq60sL/p0IkcjUEEPwS3GFrIyw==",
"dev": true
},
"arrify": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/arrify/-/arrify-2.0.1.tgz",
"integrity": "sha512-3duEwti880xqi4eAMN8AyR4a0ByT90zoYdLlevfrvU43vb0YZwZVfxOgxWrLXXXpyugL0hNZc9G6BiB5B3nUug==",
"dev": true
},
"astral-regex": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/astral-regex/-/astral-regex-2.0.0.tgz",
@ -15040,6 +15359,16 @@
"resolved": "https://registry.npmjs.org/find-root/-/find-root-1.1.0.tgz",
"integrity": "sha512-NKfW6bec6GfKc0SGx1e07QZY9PE99u0Bft/0rzSD5k3sO/vwkVUpDUKVm5Gpp5Ue3YfShPFTX2070tDs5kB9Ng=="
},
"find-up": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/find-up/-/find-up-4.1.0.tgz",
"integrity": "sha512-PpOwAdQ/YlXQ2vj8a3h8IipDuYRi3wceVQQGYWxNINccq40Anw7BlsEXCMbt1Zt+OLA6Fq9suIpIWD0OsnISlw==",
"dev": true,
"requires": {
"locate-path": "^5.0.0",
"path-exists": "^4.0.0"
}
},
"find-versions": {
"version": "5.1.0",
"resolved": "https://registry.npmjs.org/find-versions/-/find-versions-5.1.0.tgz",
@ -15436,6 +15765,12 @@
"resolved": "https://registry.npmjs.org/ieee754/-/ieee754-1.2.1.tgz",
"integrity": "sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA=="
},
"ignore": {
"version": "5.3.0",
"resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.0.tgz",
"integrity": "sha512-g7dmpshy+gD7mh88OC9NwSGTKoc3kyLAZQRU1mt53Aw/vnvfXnbC+F/7F7QoYVKbV+KNvJx8wArewKy1vXMtlg==",
"dev": true
},
"import-fresh": {
"version": "3.3.0",
"resolved": "https://registry.npmjs.org/import-fresh/-/import-fresh-3.3.0.tgz",
@ -15893,6 +16228,15 @@
"resolved": "https://registry.npmjs.org/lines-and-columns/-/lines-and-columns-1.2.4.tgz",
"integrity": "sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg=="
},
"locate-path": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/locate-path/-/locate-path-5.0.0.tgz",
"integrity": "sha512-t7hw9pI+WvuwNJXwk5zVHpyhIqzg2qTlklJOf0mVxGSbe3Fp2VieZcduNYjaLDoy6p9uGpQEGWG87WpMKlNq8g==",
"dev": true,
"requires": {
"p-locate": "^4.1.0"
}
},
"lodash": {
"version": "4.17.21",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
@ -16597,6 +16941,19 @@
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.2.tgz",
"integrity": "sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w=="
},
"multimatch": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/multimatch/-/multimatch-4.0.0.tgz",
"integrity": "sha512-lDmx79y1z6i7RNx0ZGCPq1bzJ6ZoDDKbvh7jxr9SJcWLkShMzXrHbYVpTdnhNM5MXpDUxCQ4DgqVttVXlBgiBQ==",
"dev": true,
"requires": {
"@types/minimatch": "^3.0.3",
"array-differ": "^3.0.0",
"array-union": "^2.1.0",
"arrify": "^2.0.1",
"minimatch": "^3.0.4"
}
},
"mz": {
"version": "2.7.0",
"resolved": "https://registry.npmjs.org/mz/-/mz-2.7.0.tgz",
@ -16776,6 +17133,30 @@
"integrity": "sha512-LICb2p9CB7FS+0eR1oqWnHhp0FljGLZCWBE9aix0Uye9W8LTQPwMTYVGWQWIw9RdQiDg4+epXQODwIYJtSJaow==",
"dev": true
},
"p-limit": {
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/p-limit/-/p-limit-2.3.0.tgz",
"integrity": "sha512-//88mFWSJx8lxCzwdAABTJL2MyWB12+eIY7MDL2SqLmAkeKU9qxRvWuSyTjm3FUmpBEMuFfckAIqEaVGUDxb6w==",
"dev": true,
"requires": {
"p-try": "^2.0.0"
}
},
"p-locate": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/p-locate/-/p-locate-4.1.0.tgz",
"integrity": "sha512-R79ZZ/0wAxKGu3oYMlz8jy/kbhsNrS7SKZ7PxEHBgJ5+F2mtFW2fK2cOtBh1cHYkQsbzFV7I+EoRKe6Yt0oK7A==",
"dev": true,
"requires": {
"p-limit": "^2.2.0"
}
},
"p-try": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/p-try/-/p-try-2.2.0.tgz",
"integrity": "sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ==",
"dev": true
},
"parent-module": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/parent-module/-/parent-module-1.0.1.tgz",
@ -16828,6 +17209,12 @@
"resolved": "https://registry.npmjs.org/path-browserify/-/path-browserify-1.0.1.tgz",
"integrity": "sha512-b7uo2UCUOYZcnF/3ID0lulOJi/bafxa1xPe7ZPsammBSpjSWQkjNxlt635YGS2MiR9GjvuXCtz2emr3jbsz98g=="
},
"path-exists": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz",
"integrity": "sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w==",
"dev": true
},
"path-is-absolute": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/path-is-absolute/-/path-is-absolute-1.0.1.tgz",
@ -17014,6 +17401,120 @@
}
}
},
"pretty-quick": {
"version": "3.1.3",
"resolved": "https://registry.npmjs.org/pretty-quick/-/pretty-quick-3.1.3.tgz",
"integrity": "sha512-kOCi2FJabvuh1as9enxYmrnBC6tVMoVOenMaBqRfsvBHB0cbpYHjdQEpSglpASDFEXVwplpcGR4CLEaisYAFcA==",
"dev": true,
"requires": {
"chalk": "^3.0.0",
"execa": "^4.0.0",
"find-up": "^4.1.0",
"ignore": "^5.1.4",
"mri": "^1.1.5",
"multimatch": "^4.0.0"
},
"dependencies": {
"chalk": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-3.0.0.tgz",
"integrity": "sha512-4D3B6Wf41KOYRFdszmDqMCGq5VV/uMAB273JILmO+3jAlh8X4qDtdtgCR3fxtbLEMzSx22QdhnDcJvu2u1fVwg==",
"dev": true,
"requires": {
"ansi-styles": "^4.1.0",
"supports-color": "^7.1.0"
}
},
"cross-spawn": {
"version": "7.0.3",
"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.3.tgz",
"integrity": "sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==",
"dev": true,
"requires": {
"path-key": "^3.1.0",
"shebang-command": "^2.0.0",
"which": "^2.0.1"
}
},
"execa": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/execa/-/execa-4.1.0.tgz",
"integrity": "sha512-j5W0//W7f8UxAn8hXVnwG8tLwdiUy4FJLcSupCg6maBYZDpyBvTApK7KyuI4bKj8KOh1r2YH+6ucuYtJv1bTZA==",
"dev": true,
"requires": {
"cross-spawn": "^7.0.0",
"get-stream": "^5.0.0",
"human-signals": "^1.1.1",
"is-stream": "^2.0.0",
"merge-stream": "^2.0.0",
"npm-run-path": "^4.0.0",
"onetime": "^5.1.0",
"signal-exit": "^3.0.2",
"strip-final-newline": "^2.0.0"
}
},
"get-stream": {
"version": "5.2.0",
"resolved": "https://registry.npmjs.org/get-stream/-/get-stream-5.2.0.tgz",
"integrity": "sha512-nBF+F1rAZVCu/p7rjzgA+Yb4lfYXrpl7a6VmJrU8wF9I1CKvP/QwPNZHnOlwbTkY6dvtFIzFMSyQXbLoTQPRpA==",
"dev": true,
"requires": {
"pump": "^3.0.0"
}
},
"human-signals": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/human-signals/-/human-signals-1.1.1.tgz",
"integrity": "sha512-SEQu7vl8KjNL2eoGBLF3+wAjpsNfA9XMlXAYj/3EdaNfAlxKthD1xjEQfGOUhllCGGJVNY34bRr6lPINhNjyZw==",
"dev": true
},
"is-stream": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/is-stream/-/is-stream-2.0.1.tgz",
"integrity": "sha512-hFoiJiTl63nn+kstHGBtewWSKnQLpyb155KHheA1l39uvtO9nWIop1p3udqPcUd/xbF1VLMO4n7OI6p7RbngDg==",
"dev": true
},
"npm-run-path": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/npm-run-path/-/npm-run-path-4.0.1.tgz",
"integrity": "sha512-S48WzZW777zhNIrn7gxOlISNAqi9ZC/uQFnRdbeIHhZhCA6UqpkOT8T1G7BvfdgP4Er8gF4sUbaS0i7QvIfCWw==",
"dev": true,
"requires": {
"path-key": "^3.0.0"
}
},
"path-key": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/path-key/-/path-key-3.1.1.tgz",
"integrity": "sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q==",
"dev": true
},
"shebang-command": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/shebang-command/-/shebang-command-2.0.0.tgz",
"integrity": "sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==",
"dev": true,
"requires": {
"shebang-regex": "^3.0.0"
}
},
"shebang-regex": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/shebang-regex/-/shebang-regex-3.0.0.tgz",
"integrity": "sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A==",
"dev": true
},
"which": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz",
"integrity": "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==",
"dev": true,
"requires": {
"isexe": "^2.0.0"
}
}
}
},
"prismjs": {
"version": "1.29.0",
"resolved": "https://registry.npmjs.org/prismjs/-/prismjs-1.29.0.tgz",

View file

@ -116,6 +116,7 @@
"prettier": "^2.8.8",
"prettier-plugin-organize-imports": "^3.2.3",
"prettier-plugin-tailwindcss": "^0.3.0",
"pretty-quick": "^3.1.3",
"tailwindcss": "^3.3.3",
"typescript": "^5.2.2",
"vite": "^4.5.1"


@ -194,7 +194,10 @@ export default function GenericNode({
takeSnapshot();
}}
>
<div className="generic-node-tooltip-div pr-2 text-primary">
<div
data-testid={"title-" + data.node?.display_name}
className="generic-node-tooltip-div pr-2 text-primary"
>
{data.node?.display_name}
</div>
{nameEditable && (
@ -364,10 +367,9 @@ export default function GenericNode({
<div
className={
showNode
? "overflow-hidden " +
(data.node?.description === "" && !nameEditable
? "pb-5"
: "py-5")
? data.node?.description === "" && !nameEditable
? "pb-5"
: "py-5"
: ""
}
>


@ -6,11 +6,13 @@ export default function PageLayout({
description,
children,
button,
betaIcon,
}: {
title: string;
description: string;
children: React.ReactNode;
button?: React.ReactNode;
betaIcon: boolean;
}) {
return (
<div className="flex h-screen w-full flex-col">
@ -18,7 +20,10 @@ export default function PageLayout({
<div className="flex h-full w-full flex-col justify-between overflow-auto bg-background px-16">
<div className="flex w-full items-center justify-between gap-4 space-y-0.5 py-8 pb-2">
<div className="flex w-full flex-col">
<h2 className="text-2xl font-bold tracking-tight">{title}</h2>
<h2 className="text-2xl font-bold tracking-tight">
{title}
{betaIcon && <span className="store-beta-icon">BETA</span>}
</h2>
<p className="text-muted-foreground">{description}</p>
</div>
<div className="flex-shrink-0">{button && button}</div>


@ -198,6 +198,7 @@ export default function FormModal({
window.location.protocol === "https:" || window.location.port === "443";
const webSocketProtocol = isSecureProtocol ? "wss" : "ws";
const host = isDevelopment ? "localhost:7860" : window.location.host;
const chatEndpoint = `/api/v1/chat/${chatId}`;
return `${


@ -231,7 +231,7 @@ export default function ExtraSidebar(): JSX.Element {
return (
<div className="side-bar-arrangement">
<div className="side-bar-buttons-arrangement">
{hasStore && (
{hasStore && validApiKey && (
<ShadTooltip
content={
!hasApiKey || !validApiKey
@ -263,7 +263,7 @@ export default function ExtraSidebar(): JSX.Element {
</button>
</ShadTooltip>
</div>
{!hasStore && ExportMemo}
{(!hasApiKey || !validApiKey) && ExportMemo}
<ShadTooltip content={"Code"} side="top">
<div className="side-bar-button">
{flow && flow.data && (
@ -285,7 +285,7 @@ export default function ExtraSidebar(): JSX.Element {
)}
</div>
</ShadTooltip>
<div className="side-bar-button">
<div className="side-bar-button" data-testid="save-button">
{flow && flow.data && (
<ShadTooltip content="Save" side="top">
<button


@ -137,6 +137,7 @@ export default function NodeToolbarComponent({
onClick={() => {
deleteNode(data.id);
}}
data-testid="delete-button-modal"
>
<IconComponent name="Trash2" className="h-4 w-4" />
</button>
@ -216,7 +217,7 @@ export default function NodeToolbarComponent({
{isSaved ? (
<SelectItem value={"override"}>
<div className="flex">
<div className="flex" data-testid="save-button-modal">
<IconComponent
name="SaveAll"
className="relative top-0.5 mr-2 h-4 w-4"
@ -226,7 +227,7 @@ export default function NodeToolbarComponent({
</SelectItem>
) : (
<SelectItem value={"SaveAll"}>
<div className="flex">
<div className="flex" data-testid="save-button-modal">
<IconComponent
name="SaveAll"
className="relative top-0.5 mr-2 h-4 w-4"


@ -164,6 +164,7 @@ export default function StorePage(): JSX.Element {
return (
<PageLayout
betaIcon
title="Langflow Store"
description="Search flows and components from the community."
button={


@ -311,7 +311,7 @@
@apply hover:text-accent-foreground hover:transition-all;
}
.generic-node-desc {
@apply h-full px-5 mb-4 w-full text-foreground;
@apply mb-4 h-full w-full px-5 text-foreground;
}
.generic-node-desc-text {
@apply w-full text-sm text-muted-foreground;
@ -1040,7 +1040,7 @@
.fade-container::before,
.fade-container::after {
@apply pointer-events-none absolute bottom-0 top-0 bg-gradient-to-r to-transparent from-background;
@apply pointer-events-none absolute bottom-0 top-0 bg-gradient-to-r from-background to-transparent;
content: "";
width: 50px;
opacity: 0;
@ -1084,4 +1084,8 @@
.scroll-container {
@apply flex overflow-x-scroll scrollbar-hide;
}
.store-beta-icon {
@apply relative bottom-3 left-1 ml-2 rounded-full bg-beta-background px-2 py-1 text-center text-xs font-semibold text-beta-foreground;
}
}


@ -134,6 +134,7 @@ import { ShareIcon } from "../icons/Share";
import { Share2Icon } from "../icons/Share2";
import SvgSlackIcon from "../icons/Slack/SlackIcon";
import { VertexAIIcon } from "../icons/VertexAI";
import { WeaviateIcon } from "../icons/Weaviate";
import SvgWikipedia from "../icons/Wikipedia/Wikipedia";
import SvgWolfram from "../icons/Wolfram/Wolfram";
import { HackerNewsIcon } from "../icons/hackerNews";
@ -250,10 +251,12 @@ export const nodeIconsLucide: iconsType = {
MongoDBAtlasVectorSearch: MongoDBIcon,
NotionDirectoryLoader: NotionIcon,
ChatOpenAI: OpenAiIcon,
AzureChatOpenAI: OpenAiIcon,
OpenAI: OpenAiIcon,
OpenAIEmbeddings: OpenAiIcon,
Pinecone: PineconeIcon,
Qdrant: QDrantIcon,
Weaviate: WeaviateIcon,
Searx: SearxIcon,
SlackDirectoryLoader: SvgSlackIcon,
SupabaseVectorStore: SupabaseIcon,


@ -0,0 +1,196 @@
{
"name": "Lonely Stonebraker",
"description": "Design Dialogues with Langflow.",
"data": {
"nodes": [
{
"width": 384,
"height": 461,
"id": "CustomComponent-MtJjl",
"type": "genericNode",
"position": { "x": 534.3712097224906, "y": -135.01908566635723 },
"data": {
"type": "CustomComponent",
"node": {
"template": {
"code": {
"type": "code",
"required": true,
"placeholder": "",
"list": false,
"show": true,
"multiline": true,
"value": "from langflow import CustomComponent\nfrom langflow.field_typing import Data\nfrom pathlib import Path\nfrom platformdirs import user_cache_dir\nimport os\n\nclass Component(CustomComponent):\n documentation: str = \"http://docs.langflow.org/components/custom\"\n\n def build_config(self):\n return {\"text_input\":{\"display_name\":\"Text Input\", \"input_types\":[\"str\"]},\"save_path\":{\"display_name\":\"Save Path\",\n \"info\":\"Put the full path with the file name and extension\",\"value\":Path(user_cache_dir(\"langflow\"))/\"text.t1.txt\"}}\n\n def build(self, text_input:str,save_path:str) -> str:\n try:\n # Create the directory if it doesn't exist\n os.makedirs(os.path.dirname(save_path), exist_ok=True)\n\n # Open the file in write mode and save the text\n with open(save_path, 'w') as file:\n file.write(text_input)\n except Exception as e:\n raise e\n self.status = text_input\n return text_input",
"fileTypes": [],
"file_path": "",
"password": false,
"name": "code",
"advanced": false,
"dynamic": true,
"info": ""
},
"save_path": {
"type": "str",
"required": true,
"placeholder": "",
"list": false,
"show": true,
"multiline": false,
"value": "/home/vazz/.cache/langflow/text.t1.txt",
"fileTypes": [],
"file_path": "",
"password": false,
"name": "save_path",
"display_name": "Save Path",
"advanced": false,
"dynamic": false,
"info": "Put the full path with the file name and extension"
},
"text_input": {
"type": "str",
"required": true,
"placeholder": "",
"list": false,
"show": true,
"multiline": false,
"fileTypes": [],
"file_path": "",
"password": false,
"name": "text_input",
"display_name": "Text Input",
"advanced": false,
"input_types": ["str"],
"dynamic": false,
"info": "",
"value": ""
},
"_type": "CustomComponent"
},
"base_classes": ["str"],
"display_name": "text checkpoint",
"documentation": "http://docs.langflow.org/components/custom",
"custom_fields": { "save_path": null, "text_input": null },
"output_types": ["str"],
"field_formatters": {},
"beta": true
},
"id": "CustomComponent-MtJjl"
},
"selected": false,
"dragging": false,
"positionAbsolute": { "x": 534.3712097224906, "y": -135.01908566635723 }
},
{
"width": 384,
"height": 453,
"id": "CustomComponent-7NQoq",
"type": "genericNode",
"position": { "x": 27.487979888011637, "y": -414.43998171034826 },
"data": {
"type": "CustomComponent",
"node": {
"template": {
"audio": {
"type": "file",
"required": true,
"placeholder": "",
"list": false,
"show": true,
"multiline": false,
"fileTypes": [],
"file_path": "/home/vazz/.cache/langflow/1b0814b7-2964-4e09-9b4b-f7413c4fb50b/b56b043d8940daecbdec03b97ad4346488c58d7cc62016560dd333aa7a6a12ce.m4a",
"password": false,
"name": "audio",
"display_name": "audio",
"advanced": false,
"dynamic": false,
"info": "",
"value": "Audio Recording 2023-12-13 at 16.35.22.m4a"
},
"OpenAIKey": {
"type": "str",
"required": true,
"placeholder": "",
"list": false,
"show": true,
"multiline": false,
"fileTypes": [],
"file_path": "",
"password": true,
"name": "OpenAIKey",
"display_name": "OpenAIKey",
"advanced": false,
"dynamic": false,
"info": "",
"value": ""
},
"code": {
"type": "code",
"required": true,
"placeholder": "",
"list": false,
"show": true,
"multiline": true,
              "value": "from langflow import CustomComponent\nfrom typing import Optional, List, Dict, Union\nfrom langflow.field_typing import (\n    AgentExecutor,\n    BaseChatMemory,\n    BaseLanguageModel,\n    BaseLLM,\n    BaseLoader,\n    BaseMemory,\n    BaseOutputParser,\n    BasePromptTemplate,\n    BaseRetriever,\n    Callable,\n    Chain,\n    ChatPromptTemplate,\n    Data,\n    Document,\n    Embeddings,\n    NestedDict,\n    Object,\n    PromptTemplate,\n    TextSplitter,\n    Tool,\n    VectorStore,\n)\n\nfrom openai import OpenAI\nimport os\nimport ffmpeg\n\nclass Component(CustomComponent):\n    display_name: str = \"Whisper Transcriber\"\n    description: str = \"Converts audio to text using OpenAI's Whisper.\"\n\n    def build_config(self):\n        return {\"audio\": {\"field_type\": \"file\", \"suffixes\": [\".mp3\", \".mp4\", \".m4a\"]}, \"OpenAIKey\": {\"field_type\": \"str\", \"password\": True}}\n\n    def calculate_segment_duration(self, audio_path, target_chunk_size_mb=24):\n        # Calculate the target chunk size in bytes\n        target_chunk_size_bytes = target_chunk_size_mb * 1024 * 1024\n\n        # Use ffprobe to get the audio file information\n        ffprobe_output = ffmpeg.probe(audio_path)\n        print(ffprobe_output)\n        # Convert duration to float\n        duration = float(ffprobe_output[\"format\"][\"duration\"])\n\n        # Calculate the approximate bitrate\n        bitrate = os.path.getsize(audio_path) / duration\n\n        # Calculate the segment duration to achieve the target chunk size\n        segment_duration = target_chunk_size_bytes / bitrate\n\n        return segment_duration\n\n    def split_audio_into_chunks(self, audio_path, target_chunk_size_mb=24):\n        # Calculate the segment duration\n        segment_duration = self.calculate_segment_duration(audio_path, target_chunk_size_mb)\n\n        # Create a directory to store the chunks\n        output_directory = f\"{os.path.splitext(audio_path)[0]}_chunks\"\n        os.makedirs(output_directory, exist_ok=True)\n\n        # Use ffmpeg-python to split the audio file into chunks\n        (\n            ffmpeg.input(audio_path)\n            .output(f\"{output_directory}/%03d{os.path.splitext(audio_path)[1]}\", codec=\"copy\", f=\"segment\", segment_time=segment_duration)\n            .run()\n        )\n\n        # Get the list of generated chunk files\n        chunks = [os.path.join(output_directory, file) for file in os.listdir(output_directory)]\n\n        return chunks\n\n    def build(self, audio: str, OpenAIKey: str) -> str:\n        # Split audio into chunks\n        audio_chunks = self.split_audio_into_chunks(audio)\n\n        client = OpenAI(api_key=OpenAIKey)\n        transcripts = []\n\n        try:\n            for chunk in audio_chunks:\n                with open(chunk, \"rb\") as chunk_file:\n                    transcript = client.audio.transcriptions.create(\n                        model=\"whisper-1\",\n                        file=chunk_file,\n                        response_format=\"text\"\n                    )\n                    transcripts.append(transcript)\n        finally:\n            # Clean up temporary chunk files\n            for chunk in audio_chunks:\n                os.remove(chunk)\n\n        # Concatenate transcripts into the final response\n        final_response = \"\\n\".join(transcripts)\n        self.status = final_response\n        return final_response\n",
"fileTypes": [],
"file_path": "",
"password": false,
"name": "code",
"advanced": false,
"dynamic": true,
"info": ""
},
"_type": "CustomComponent"
},
"description": "Converts audio to text using OpenAI's Whisper.",
"base_classes": ["str"],
"display_name": "Whisper Transcriber",
"documentation": "",
"custom_fields": { "OpenAIKey": null, "audio": null },
"output_types": ["str"],
"field_formatters": {},
"beta": true
},
"id": "CustomComponent-7NQoq"
},
"selected": true,
"positionAbsolute": {
"x": 27.487979888011637,
"y": -414.43998171034826
},
"dragging": false
}
],
"edges": [
{
"source": "CustomComponent-7NQoq",
"sourceHandle": "{œbaseClassesœ:[œstrœ],œdataTypeœ:œCustomComponentœ,œidœ:œCustomComponent-7NQoqœ}",
"target": "CustomComponent-MtJjl",
"targetHandle": "{œfieldNameœ:œtext_inputœ,œidœ:œCustomComponent-MtJjlœ,œinputTypesœ:[œstrœ],œtypeœ:œstrœ}",
"data": {
"targetHandle": {
"fieldName": "text_input",
"id": "CustomComponent-MtJjl",
"inputTypes": ["str"],
"type": "str"
},
"sourceHandle": {
"baseClasses": ["str"],
"dataType": "CustomComponent",
"id": "CustomComponent-7NQoq"
}
},
"style": { "stroke": "#555" },
"className": "stroke-gray-900 stroke-connection",
"animated": false,
"id": "reactflow__edge-CustomComponent-7NQoq{œbaseClassesœ:[œstrœ],œdataTypeœ:œCustomComponentœ,œidœ:œCustomComponent-7NQoqœ}-CustomComponent-MtJjl{œfieldNameœ:œtext_inputœ,œidœ:œCustomComponent-MtJjlœ,œinputTypesœ:[œstrœ],œtypeœ:œstrœ}"
}
],
"viewport": { "x": 119.37759169012509, "y": 351.3082742479685, "zoom": 1 }
},
"is_component": false,
"updated_at": "2023-12-13T23:51:56.874099",
"folder": null,
"id": "1b0814b7-2964-4e09-9b4b-f7413c4fb50b",
"user_id": "8b5cf798-f1b8-4108-88fd-d7274d08d471"
}


@ -26,7 +26,7 @@ test.describe("drag and drop test", () => {
const dataTransfer = await page.evaluateHandle((data) => {
const dt = new DataTransfer();
// Convert the buffer to a hex array
const file = new File([data], "collection.json", {
const file = new File([data], "flowtest.json", {
type: "application/json",
});
dt.items.add(file);
@ -34,54 +34,25 @@ test.describe("drag and drop test", () => {
}, jsonContent);
// Now dispatch
await page.dispatchEvent('//*[@id="root"]/div/div[2]/div[2]', "drop", {
dataTransfer,
});
expect(
await page
.locator(".main-page-flows-display")
.evaluate((el) => el.children)
).toBeTruthy();
});
test("drop flow", async ({ page }) => {
await page.routeFromHAR("harFiles/langflow.har", {
url: "**/api/v1/**",
update: false,
});
await page.route("**/api/v1/flows/", async (route) => {
const json = {
id: "e9ac1bdc-429b-475d-ac03-d26f9a2a3210",
};
await route.fulfill({ json, status: 201 });
});
await page.goto("http://localhost:3000/");
await page.locator("span").filter({ hasText: "My Collection" }).isVisible();
// Read your file into a buffer.
const jsonContent = readFileSync(
"tests/onlyFront/assets/flow.json",
"utf-8"
await page.dispatchEvent(
'//*[@id="root"]/div/div[1]/div[2]/div[3]/div/div',
"drop",
{
dataTransfer,
}
);
// Create the DataTransfer and File
const dataTransfer = await page.evaluateHandle((data) => {
const dt = new DataTransfer();
// Convert the buffer to a hex array
const file = new File([data], "flow.json", {
type: "application/json",
});
dt.items.add(file);
return dt;
}, jsonContent);
await page
.locator(
'//*[@id="root"]/div/div[1]/div[2]/div[3]/div/div/div/div/div/div/div/div[2]/span[2]'
)
.click();
await page.waitForTimeout(2000);
// Now dispatch
await page.dispatchEvent('//*[@id="root"]/div/div[2]/div[2]', "drop", {
dataTransfer,
});
expect(
await page
.locator(".main-page-flows-display")
.evaluate((el) => el.children)
).toBeTruthy();
const genericNode = page.getByTestId("div-generic-node");
const elementCount = await genericNode.count();
if (elementCount > 0) {
expect(true).toBeTruthy();
}
});
});


@ -15,10 +15,10 @@ test.describe("group node test", () => {
await route.fulfill({ json, status: 201 });
});
await page.goto("http://localhost:3000/");
await page.locator('//*[@id="new-project-btn"]').click();
await page.locator("span").filter({ hasText: "My Collection" }).isVisible();
// Read your file into a buffer.
const jsonContent = readFileSync(
"tests/onlyFront/assets/flow.json",
"tests/onlyFront/assets/collection.json",
"utf-8"
);
@ -26,7 +26,7 @@ test.describe("group node test", () => {
const dataTransfer = await page.evaluateHandle((data) => {
const dt = new DataTransfer();
// Convert the buffer to a hex array
const file = new File([data], "flow.json", {
const file = new File([data], "flowtest.json", {
type: "application/json",
});
dt.items.add(file);
@ -34,97 +34,51 @@ test.describe("group node test", () => {
}, jsonContent);
// Now dispatch
await page.dispatchEvent('//*[@id="root"]/div/div[2]/div[2]', "drop", {
dataTransfer,
});
expect(
await page
.locator(".main-page-flows-display")
.evaluate((el) => el.children)
).toBeTruthy();
await page.getByRole("button", { name: "Edit Flow" }).click();
//inside the flow
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[1]/div/div[2]/div[1]/div/div[1]/div"
)
.click({
modifiers: ["Control"],
});
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[1]/div/div[2]/div[2]/div/div[1]/div"
)
.click({
modifiers: ["Control"],
});
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[1]/div/div[2]/div[3]/div/div[1]/div"
)
.click({
modifiers: ["Control"],
});
await page.getByRole("button", { name: "Group" }).click();
expect(
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[1]/div/div[2]/div/div"
)
.isVisible()
).toBeTruthy();
await page.getByPlaceholder("Type something...").first().click();
await page.getByPlaceholder("Type something...").first().fill("test");
await page.locator(".side-bar-buttons-arrangement").click();
expect(
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div/div/div[2]/div/div/div[1]/div/div[1]/div/div"
)
.textContent()
).toBe("test");
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[1]/div/div[2]/div/div"
)
.locator('input[type="text"]')
.click();
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[1]/div/div[2]/div/div"
)
.locator('input[type="text"]')
.fill("fieldValue");
await page.locator(".side-bar-buttons-arrangement").click();
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[1]/div/div[2]/div/div/div[1]/div"
)
.click();
await page.dispatchEvent(
'//*[@id="root"]/div/div[1]/div[2]/div[3]/div/div',
"drop",
{
dataTransfer,
}
);
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[2]/div/span/button[3]/div/div"
'//*[@id="root"]/div/div[1]/div[2]/div[3]/div/div/div/div/div/div/div/div[2]/span[2]'
)
.click();
await page.getByLabel("Edit").click();
await page
.getByRole("button", { name: "zero-shot-react-description" })
.click();
await page.getByText("openai-functions").click();
await page.getByRole("button", { name: "Save Changes" }).click();
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[2]/div/span/button[3]/div/div"
)
.click();
await page.getByLabel("Ungroup").click();
await expect(page.locator('//*[@id="input-2"]')).toHaveValue("fieldValue");
expect(
await page
.getByTestId(/.*rf__node-AgentInitializer.*/)
.getByRole("button", { name: "openai-functions" })
.textContent()
).toBe("openai-functions");
await page.waitForTimeout(2000);
const genericNode = page.getByTestId("div-generic-node");
const elementCount = await genericNode.count();
if (elementCount > 0) {
expect(true).toBeTruthy();
}
await page.getByTestId("title-PythonFunctionTool").click({
modifiers: ["Control"],
});
await page.getByTestId("title-ChatOpenAI").click({
modifiers: ["Control"],
});
await page.getByTestId("title-AgentInitializer").click({
modifiers: ["Control"],
});
await page.getByRole("button", { name: "Group" }).click();
await page.locator("div").filter({ hasText: "Star13756" }).nth(3).click();
const textArea = page.getByTestId("div-textarea-2");
const elementCountText = await textArea.count();
if (elementCountText > 0) {
expect(true).toBeTruthy();
}
const groupNode = page.getByTestId("title-Group");
const elementGroup = await groupNode.count();
if (elementGroup > 0) {
expect(true).toBeTruthy();
}
});
});
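The grouping tests above locate React Flow nodes with regex test IDs such as `/.*rf__node-AgentInitializer.*/`, because the rendered IDs carry a generated suffix. A minimal Python sketch of that matching behavior (the suffix format in the sample IDs is an assumption, not Langflow's actual scheme):

```python
import re

# Hypothetical React Flow test IDs: the "rf__node-" prefix plus the node
# type and a generated suffix (the exact suffix format is an assumption).
test_ids = [
    "rf__node-AgentInitializer-x7Kp2",
    "rf__node-ChatOpenAI-9mQz1",
    "rf__node-PythonFunctionTool-a1B2c",
]

# Same pattern the Playwright tests pass to getByTestId().
pattern = re.compile(r".*rf__node-AgentInitializer.*")

matches = [tid for tid in test_ids if pattern.search(tid)]
```

Only the agent-initializer node survives the filter, which is how the tests count nodes of one type regardless of the generated suffix.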

View file

@@ -1,128 +1,121 @@
import { expect, test } from "@playwright/test";
import { test } from "@playwright/test";
test.describe("Login Tests", () => {
test("Login_Success", async ({ page }) => {
await page.route("**/api/v1/login", async (route) => {
const json = {
access_token:
"eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhMWNlM2FkOS1iZTE2LTRiNjgtOGRhYi1hYjA4YTVjMmZjZTkiLCJleHAiOjE2OTUyNTIwNTh9.MBYFwMhTcZnsW_L7p4qavUhSDylCllJQWUCJdU1wX8o",
refresh_token:
"eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhMWNlM2FkOS1iZTE2LTRiNjgtOGRhYi1hYjA4YTVjMmZjZTkiLCJ0eXBlIjoicmYiLCJleHAiOjE2OTUyNTI2NTh9.a4wL9-XK_zyTyrXduBFgCsODFXrqiByVr5HOeiCbiQA",
token_type: "bearer",
};
await route.fulfill({ json });
});
await page.goto("http://localhost:3000/");
await page.waitForURL("http://localhost:3000/login");
await page.waitForURL("http://localhost:3000/login", { timeout: 100 });
await page.getByPlaceholder("Username").click();
await page.getByPlaceholder("Username").fill("test");
await page.getByPlaceholder("Password").click();
await page.getByPlaceholder("Password").fill("test");
await page.getByRole("button", { name: "Sign in" }).click();
await page.getByRole("button", { name: "Community Examples" }).click();
await page.waitForSelector(".community-pages-flows-panel");
expect(
await page
.locator(".community-pages-flows-panel")
.evaluate((el) => el.children)
).toBeTruthy();
});
test("Login Error", async ({ page }) => {
await page.route("**/api/v1/login", async (route) => {
const json = { detail: "Incorrect username or password" };
await route.fulfill({ json, status: 401 });
});
await page.goto("http://localhost:3000/");
await page.waitForURL("http://localhost:3000/login");
await page.waitForURL("http://localhost:3000/login", { timeout: 100 });
await page.getByPlaceholder("Username").click();
await page.getByPlaceholder("Username").fill("test");
await page.getByPlaceholder("Password").click();
await page.getByPlaceholder("Password").fill("test5");
await page.getByRole("button", { name: "Sign in" }).click();
await page.getByRole("heading", { name: "Error signing in" }).click();
});
test("Login create account wrong form", async ({ page }) => {
const fulfillForm = async (username, password, confirmPassword) => {
await page.getByPlaceholder("Username").click();
await page.getByPlaceholder("Username").fill(username);
await page.getByPlaceholder("Password", { exact: true }).click();
await page.getByPlaceholder("Password", { exact: true }).fill(password);
await page.getByPlaceholder("Confirm your password").click();
await page
.getByPlaceholder("Confirm your password")
.fill(confirmPassword);
};
await page.goto("http://localhost:3000/");
await page.waitForURL("http://localhost:3000/login");
await page.waitForURL("http://localhost:3000/login", { timeout: 100 });
await page
.getByRole("button", { name: "Don't have an account? Sign Up" })
.click();
await page.getByText("Sign up to Langflow").click();
await page.goto("http://localhost:3000/signup");
await page.getByText("Sign up to Langflow").click();
await fulfillForm("name", "vazz", "vazz5");
expect(
await page.getByRole("button", { name: "Sign up" }).isDisabled()
).toBeTruthy();
await fulfillForm("", "vazz", "vazz");
expect(
await page.getByRole("button", { name: "Sign up" }).isDisabled()
).toBeTruthy();
await fulfillForm("name", "", "");
expect(
await page.getByRole("button", { name: "Sign up" }).isDisabled()
).toBeTruthy();
await fulfillForm("", "", "");
expect(
await page.getByRole("button", { name: "Sign up" }).isDisabled()
).toBeTruthy();
});
test("Login create account success", async ({ page }) => {
await page.route("**/api/v1/users/", async (route) => {
const json = {
id: "e9ac1bdc-429b-475d-ac03-d26f9a2a3210",
username: "teste",
profile_image: null,
is_active: false,
is_superuser: false,
create_at: "2023-09-21T01:45:51.873303",
updated_at: "2023-09-21T01:45:51.873305",
last_login_at: null,
};
await route.fulfill({ json, status: 201 });
});
const submitForm = async (username, password, confirmPassword) => {
await page.getByPlaceholder("Username").click();
await page.getByPlaceholder("Username").fill(username);
await page.getByPlaceholder("Password", { exact: true }).click();
await page.getByPlaceholder("Password", { exact: true }).fill(password);
await page.getByPlaceholder("Confirm your password").click();
await page
.getByPlaceholder("Confirm your password")
.fill(confirmPassword);
};
await page.goto("http://localhost:3000/");
await page.waitForURL("http://localhost:3000/login");
await page.waitForURL("http://localhost:3000/login", { timeout: 100 });
await page
.getByRole("button", { name: "Don't have an account? Sign Up" })
.click();
await page.getByText("Sign up to Langflow").click();
await page.goto("http://localhost:3000/signup");
await page.getByText("Sign up to Langflow").click();
await submitForm("teste", "pass", "pass");
await page.getByRole("button", { name: "Sign up" }).click();
await page.waitForURL("http://localhost:3000/login", { timeout: 1000 });
await page.getByText("Account created! Await admin activation.").click();
// await page.route("**/api/v1/login", async (route) => {
// const json = {
// access_token:
// "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhMWNlM2FkOS1iZTE2LTRiNjgtOGRhYi1hYjA4YTVjMmZjZTkiLCJleHAiOjE2OTUyNTIwNTh9.MBYFwMhTcZnsW_L7p4qavUhSDylCllJQWUCJdU1wX8o",
// refresh_token:
// "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhMWNlM2FkOS1iZTE2LTRiNjgtOGRhYi1hYjA4YTVjMmZjZTkiLCJ0eXBlIjoicmYiLCJleHAiOjE2OTUyNTI2NTh9.a4wL9-XK_zyTyrXduBFgCsODFXrqiByVr5HOeiCbiQA",
// token_type: "bearer",
// };
// await route.fulfill({ json });
// });
// await page.goto("http://localhost:3000/");
// await page.waitForURL("http://localhost:3000/login");
// await page.waitForURL("http://localhost:3000/login", { timeout: 100 });
// await page.getByPlaceholder("Username").click();
// await page.getByPlaceholder("Username").fill("test");
// await page.getByPlaceholder("Password").click();
// await page.getByPlaceholder("Password").fill("test");
// await page.getByRole("button", { name: "Sign in" }).click();
// await page.getByRole("button", { name: "Community Examples" }).click();
// await page.waitForSelector(".community-pages-flows-panel");
// expect(
// await page
// .locator(".community-pages-flows-panel")
// .evaluate((el) => el.children)
// ).toBeTruthy();
// });
// test("Login Error", async ({ page }) => {
// await page.route("**/api/v1/login", async (route) => {
// const json = { detail: "Incorrect username or password" };
// await route.fulfill({ json, status: 401 });
// });
// await page.goto("http://localhost:3000/");
// await page.waitForURL("http://localhost:3000/login");
// await page.waitForURL("http://localhost:3000/login", { timeout: 100 });
// await page.getByPlaceholder("Username").click();
// await page.getByPlaceholder("Username").fill("test");
// await page.getByPlaceholder("Password").click();
// await page.getByPlaceholder("Password").fill("test5");
// await page.getByRole("button", { name: "Sign in" }).click();
// await page.getByRole("heading", { name: "Error signing in" }).click();
// });
// test("Login create account wrong form", async ({ page }) => {
// const fullfillForm = async (username, password, confirmPassword) => {
// await page.getByPlaceholder("Username").click();
// await page.getByPlaceholder("Username").fill(username);
// await page.getByPlaceholder("Password", { exact: true }).click();
// await page.getByPlaceholder("Password", { exact: true }).fill(password);
// await page.getByPlaceholder("Confirm your password").click();
// await page
// .getByPlaceholder("Confirm your password")
// .fill(confirmPassword);
// };
// await page.goto("http://localhost:3000/");
// await page.waitForURL("http://localhost:3000/login");
// await page.waitForURL("http://localhost:3000/login", { timeout: 100 });
// await page
// .getByRole("button", { name: "Don't have an account? Sign Up" })
// .click();
// await page.getByText("Sign up to Langflow").click();
// await page.goto("http://localhost:3000/signup");
// await page.getByText("Sign up to Langflow").click();
// await fullfillForm("name", "vazz", "vazz5");
// expect(
// await page.getByRole("button", { name: "Sign up" }).isDisabled()
// ).toBeTruthy();
// await fullfillForm("", "vazz", "vazz");
// expect(
// await page.getByRole("button", { name: "Sign up" }).isDisabled()
// ).toBeTruthy();
// await fullfillForm("name", "", "");
// expect(
// await page.getByRole("button", { name: "Sign up" }).isDisabled()
// ).toBeTruthy();
// await fullfillForm("", "", "");
// expect(
// await page.getByRole("button", { name: "Sign up" }).isDisabled()
// ).toBeTruthy();
// });
// test("Login create account success", async ({ page }) => {
// await page.route("**/api/v1/users/", async (route) => {
// const json = {
// id: "e9ac1bdc-429b-475d-ac03-d26f9a2a3210",
// username: "teste",
// profile_image: null,
// is_active: false,
// is_superuser: false,
// create_at: "2023-09-21T01:45:51.873303",
// updated_at: "2023-09-21T01:45:51.873305",
// last_login_at: null,
// };
// await route.fulfill({ json, status: 201 });
// });
// const submitForm = async (username, password, confirmPassword) => {
// await page.getByPlaceholder("Username").click();
// await page.getByPlaceholder("Username").fill(username);
// await page.getByPlaceholder("Password", { exact: true }).click();
// await page.getByPlaceholder("Password", { exact: true }).fill(password);
// await page.getByPlaceholder("Confirm your password").click();
// await page
// .getByPlaceholder("Confirm your password")
// .fill(confirmPassword);
// };
// await page.goto("http://localhost:3000/");
// await page.waitForURL("http://localhost:3000/login");
// await page.waitForURL("http://localhost:3000/login", { timeout: 100 });
// await page
// .getByRole("button", { name: "Don't have an account? Sign Up" })
// .click();
// await page.getByText("Sign up to Langflow").click();
// await page.goto("http://localhost:3000/signup");
// await page.getByText("Sign up to Langflow").click();
// await submitForm("teste", "pass", "pass");
// await page.getByRole("button", { name: "Sign up" }).click();
// await page.waitForURL("http://localhost:3000/login", { timeout: 1000 });
// await page.getByText("Account created! Await admin activation.").click();
});
});
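The mocked `**/api/v1/login` route above fulfills with hard-coded JWT-style tokens. Their middle segment is just base64url-encoded JSON, so the fixture's claims can be inspected without verifying the signature; a sketch using the access token from the test:

```python
import base64
import json

def jwt_payload(token: str) -> dict:
    """Decode the (unverified) payload segment of a JWT-style token."""
    payload_b64 = token.split(".")[1]
    # Restore the base64 padding that JWTs strip off.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# The fixed access_token fixture from the mocked login route above.
access_token = (
    "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9."
    "eyJzdWIiOiJhMWNlM2FkOS1iZTE2LTRiNjgtOGRhYi1hYjA4YTVjMmZjZTkiLCJleHAiOjE2OTUyNTIwNTh9."
    "MBYFwMhTcZnsW_L7p4qavUhSDylCllJQWUCJdU1wX8o"
)

payload = jwt_payload(access_token)
```

Decoding shows the fixture carries a `sub` (user id) and an `exp` timestamp; note this is plain inspection, not validation, which is fine for a mocked frontend test.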

View file

@@ -5,12 +5,6 @@ test.describe("save component tests", () => {
async function saveComponent(page: Page, pattern: RegExp, n: number) {
for (let i = 0; i < n; i++) {
await page.getByTestId(pattern).click();
//more node options
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[2]/div/span/button[3]/div/div"
)
.click();
await page.getByLabel("Save").click();
}
}
@@ -32,7 +26,7 @@ test.describe("save component tests", () => {
await page.locator("span").filter({ hasText: "My Collection" }).isVisible();
// Read your file into a buffer.
const jsonContent = readFileSync(
"tests/onlyFront/assets/flow.json",
"tests/onlyFront/assets/collection.json",
"utf-8"
);
@@ -40,7 +34,7 @@ test.describe("save component tests", () => {
const dataTransfer = await page.evaluateHandle((data) => {
const dt = new DataTransfer();
// Convert the buffer to a hex array
const file = new File([data], "flow.json", {
const file = new File([data], "flowtest.json", {
type: "application/json",
});
dt.items.add(file);
@@ -48,207 +42,79 @@ test.describe("save component tests", () => {
}, jsonContent);
// Now dispatch
await page.dispatchEvent('//*[@id="root"]/div/div[2]/div[2]', "drop", {
dataTransfer,
await page.dispatchEvent(
'//*[@id="root"]/div/div[1]/div[2]/div[3]/div/div',
"drop",
{
dataTransfer,
}
);
await page
.locator(
'//*[@id="root"]/div/div[1]/div[2]/div[3]/div/div/div/div/div/div/div/div[2]/span[2]'
)
.click();
await page.waitForTimeout(2000);
const genericNode = page.getByTestId("div-generic-node");
const elementCount = await genericNode.count();
if (elementCount > 0) {
expect(true).toBeTruthy();
}
await page.getByTestId("title-PythonFunctionTool").click({
modifiers: ["Control"],
});
expect(
await page
.locator(".main-page-flows-display")
.evaluate((el) => el.children)
).toBeTruthy();
await page.getByRole("button", { name: "Edit Flow" }).click();
//inside the flow
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[1]/div/div[2]/div[1]/div/div[1]/div"
)
.click({
modifiers: ["Control"],
});
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[1]/div/div[2]/div[2]/div/div[1]/div"
)
.click({
modifiers: ["Control"],
});
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[1]/div/div[2]/div[3]/div/div[1]/div"
)
.click({
modifiers: ["Control"],
});
await page.getByTestId("title-ChatOpenAI").click({
modifiers: ["Control"],
});
await page.getByTestId("title-AgentInitializer").click({
modifiers: ["Control"],
});
await page.getByRole("button", { name: "Group" }).click();
expect(
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[1]/div/div[2]/div/div"
)
.isVisible()
).toBeTruthy();
await page.getByPlaceholder("Type something...").first().click();
await page.getByPlaceholder("Type something...").first().fill("save");
await page.locator(".react-flow__pane").click();
await page
.locator(".side-bar-buttons-arrangement > div:nth-child(3)")
.click();
//more option click
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[2]/div/span/button[3]/div/div"
)
.click();
await page.getByLabel("Save").click();
await page.getByPlaceholder("Search").click();
await page.getByPlaceholder("Search").fill("save");
await page.waitForTimeout(2000);
await page
.locator('//*[@id="custom_componentssave"]')
.dragTo(page.locator('//*[@id="react-flow-id"]'));
await page.waitForTimeout(2000);
expect(
(await page.getByTestId(/.*rf__node-AgentInitializer.*/).all()).length
).toBe(2);
await page.locator(".isolate > button").first().click();
expect(
(await page.getByTestId(/.*rf__node-AgentInitializer.*/).all()).length
).toBe(1);
await page.getByTestId(/.*rf__node-AgentInitializer.*/).click();
await page.getByTestId(/.*rf__node-AgentInitializer.*/).press("Backspace");
await page
.locator('//*[@id="custom_componentssave"]')
.dragTo(page.locator('//*[@id="react-flow-id"]'));
await page.getByTestId(/.*rf__node-AgentInitializer.*/).click();
await page
.locator(
"//html/body/div/div/div[2]/div/main/div/div/div/div[1]/div[1]/div[2]/div/span/button[3]/div/div"
)
.click();
await page.getByLabel("Ungroup").click();
expect((await page.getByTestId(/.*rf__node-.*/).all()).length).toBe(3);
expect(
(await page.getByTestId(/.*rf__edge-reactflow.*/).all()).length
).toBe(2);
});
await page.locator("div").filter({ hasText: "Star13756" }).nth(3).click();
test("save default component with custom values", async ({ page }) => {
await page.routeFromHAR("harFiles/langflow.har", {
url: "**/api/v1/**",
update: false,
});
await page.route("**/api/v1/flows/", async (route) => {
const json = {
id: "e9ac1bdc-429b-475d-ac03-d26f9a2a3210",
};
await route.fulfill({ json, status: 201 });
});
await page.goto("http://localhost:3000/");
await page.locator("span").filter({ hasText: "My Collection" }).isVisible();
await page.locator('//*[@id="new-project-btn"]').click();
let textArea = page.getByTestId("div-textarea-2");
let elementCountText = await textArea.count();
if (elementCountText > 0) {
expect(true).toBeTruthy();
}
let groupNode = page.getByTestId("title-Group");
let elementGroup = await groupNode.count();
if (elementGroup > 0) {
expect(true).toBeTruthy();
}
await page.getByTestId("title-Group").click();
await page.getByTestId("more-options-modal").click();
await page.getByTestId("save-button-modal").click();
await page.getByTestId("delete-button-modal").click();
await page.getByPlaceholder("Search").click();
await page.getByPlaceholder("Search").fill("group");
await page.waitForTimeout(2000);
await page.getByPlaceholder("Search").click();
await page.getByPlaceholder("Search").fill("Chroma");
await page
.locator('//*[@id="vectorstoresChroma"]')
.getByTestId("saved_componentsGroup")
.first()
.dragTo(page.locator('//*[@id="react-flow-id"]'));
await page.locator("#input-8").click();
await page.locator("#input-8").fill("test");
await saveComponent(page, /.*rf__node-Chroma.*/, 1);
await page.getByTestId(/.*rf__node-Chroma.*/).press("Backspace");
await page.getByPlaceholder("Search").click();
await page.getByPlaceholder("Search").fill("");
await page.getByPlaceholder("Search").fill("Chroma");
await page
.locator('//*[@id="custom_componentsChroma"]')
.dragTo(page.locator('//*[@id="react-flow-id"]'));
expect(await page.locator("#input-8").inputValue()).toBe("test");
});
await page.mouse.up();
await page.mouse.down();
test("save same component multiple times", async ({ page }) => {
await page.routeFromHAR("harFiles/langflow.har", {
url: "**/api/v1/**",
update: false,
});
await page.route("**/api/v1/flows/", async (route) => {
const json = {
id: "e9ac1bdc-429b-475d-ac03-d26f9a2a3210",
};
await route.fulfill({ json, status: 201 });
});
await page.goto("http://localhost:3000/");
await page.locator("span").filter({ hasText: "My Collection" }).isVisible();
await page.locator('//*[@id="new-project-btn"]').click();
await page.waitForTimeout(2000);
textArea = page.getByTestId("div-textarea-2");
elementCountText = await textArea.count();
if (elementCountText > 0) {
expect(true).toBeTruthy();
}
await page.getByPlaceholder("Search").click();
await page.getByPlaceholder("Search").fill("Chroma");
await page
.locator('//*[@id="vectorstoresChroma"]')
.dragTo(page.locator('//*[@id="react-flow-id"]'));
await saveComponent(page, /.*rf__node-Chroma.*/, 3);
await page.getByTestId(/.*rf__node-Chroma.*/).press("Backspace");
await page.getByPlaceholder("Search").click();
await page.getByPlaceholder("Search").fill("");
await page.getByPlaceholder("Search").fill("Chroma");
expect(
await page.locator('//*[@id="custom_componentsChroma"]').isVisible()
).toBeTruthy();
expect(
await page.locator('[id="custom_componentsChroma\\ \\(1\\)"]').isVisible()
).toBeTruthy();
expect(
await page.locator('[id="custom_componentsChroma\\ \\(2\\)"]').isVisible()
).toBeTruthy();
await page
.locator('[id="custom_componentsChroma\\ \\(2\\)"]')
.dragTo(page.locator('//*[@id="react-flow-id"]'));
expect(
(await page.getByTestId(/.*rf__node-Chroma.*/).allInnerTexts()).includes(
"Chroma (2)"
)
).toBeTruthy();
});
test("save default component and delete it", async ({ page }) => {
await page.routeFromHAR("harFiles/langflow.har", {
url: "**/api/v1/**",
update: false,
});
await page.route("**/api/v1/flows/", async (route) => {
const json = {
id: "e9ac1bdc-429b-475d-ac03-d26f9a2a3210",
};
await route.fulfill({ json, status: 201 });
});
await page.goto("http://localhost:3000/");
await page.locator("span").filter({ hasText: "My Collection" }).isVisible();
await page.locator('//*[@id="new-project-btn"]').click();
await page.waitForTimeout(2000);
await page.getByPlaceholder("Search").click();
await page.getByPlaceholder("Search").fill("Chroma");
await page
.locator('//*[@id="vectorstoresChroma"]')
.dragTo(page.locator('//*[@id="react-flow-id"]'));
await saveComponent(page, /.*rf__node-Chroma.*/, 1);
await page.getByTestId(/.*rf__node-Chroma.*/).press("Backspace");
await page.getByPlaceholder("Search").click();
await page.getByPlaceholder("Search").fill("");
await page.getByPlaceholder("Search").fill("Chroma");
await page.locator("#custom_componentsChroma").getByRole("combobox").click({
button: "right",
});
await page.getByLabel("Delete").click();
await page.getByPlaceholder("Search").click();
await page.getByPlaceholder("Search").fill(" ");
await page.getByPlaceholder("Search").fill("Chroma");
expect(
await page.locator("#custom_componentsChroma").isVisible()
).toBeFalsy();
groupNode = page.getByTestId("title-Group");
elementGroup = await groupNode.count();
if (elementGroup > 0) {
expect(true).toBeTruthy();
}
});
});

View file

@@ -3,12 +3,10 @@ import types
from uuid import uuid4
import pytest
from fastapi import HTTPException
from langflow.interface.custom.base import CustomComponent
from langflow.interface.custom.code_parser import CodeParser, CodeSyntaxError
from langflow.interface.custom.component import Component, ComponentCodeNullError
from langflow.interface.types import build_custom_component_template, create_and_validate_component
from langflow.interface.custom.code_parser.code_parser import CodeParser, CodeSyntaxError
from langflow.interface.custom.custom_component.component import Component, ComponentCodeNullError
from langflow.interface.custom.utils import build_custom_component_template
from langflow.services.database.models.flow import Flow, FlowCreate
code_default = """
@@ -49,7 +47,7 @@ def test_code_parser_get_tree():
Test the __get_tree method of the CodeParser class.
"""
parser = CodeParser(code_default)
tree = parser._CodeParser__get_tree()
tree = parser.get_tree()
assert isinstance(tree, ast.AST)
@@ -62,7 +60,7 @@ def test_code_parser_syntax_error():
parser = CodeParser(code_syntax_error)
with pytest.raises(CodeSyntaxError):
parser._CodeParser__get_tree()
parser.get_tree()
def test_component_init():
@@ -139,7 +137,7 @@ def test_code_parser_parse_imports_import():
class with an import statement.
"""
parser = CodeParser(code_default)
tree = parser._CodeParser__get_tree()
tree = parser.get_tree()
for node in ast.walk(tree):
if isinstance(node, ast.Import):
parser.parse_imports(node)
@@ -152,7 +150,7 @@ def test_code_parser_parse_imports_importfrom():
class with an import from statement.
"""
parser = CodeParser("from os import path")
tree = parser._CodeParser__get_tree()
tree = parser.get_tree()
for node in ast.walk(tree):
if isinstance(node, ast.ImportFrom):
parser.parse_imports(node)
@@ -164,7 +162,7 @@ def test_code_parser_parse_functions():
Test the parse_functions method of the CodeParser class.
"""
parser = CodeParser("def test(): pass")
tree = parser._CodeParser__get_tree()
tree = parser.get_tree()
for node in ast.walk(tree):
if isinstance(node, ast.FunctionDef):
parser.parse_functions(node)
@@ -177,7 +175,7 @@ def test_code_parser_parse_classes():
Test the parse_classes method of the CodeParser class.
"""
parser = CodeParser("class Test: pass")
tree = parser._CodeParser__get_tree()
tree = parser.get_tree()
for node in ast.walk(tree):
if isinstance(node, ast.ClassDef):
parser.parse_classes(node)
@@ -190,7 +188,7 @@ def test_code_parser_parse_global_vars():
Test the parse_global_vars method of the CodeParser class.
"""
parser = CodeParser("x = 1")
tree = parser._CodeParser__get_tree()
tree = parser.get_tree()
for node in ast.walk(tree):
if isinstance(node, ast.Assign):
parser.parse_global_vars(node)
@@ -311,7 +309,7 @@ def test_code_parser_parse_ann_assign():
stmt = ast.AnnAssign(
target=ast.Name(id="x", ctx=ast.Store()),
annotation=ast.Name(id="int", ctx=ast.Load()),
value=ast.Num(n=1),
value=ast.Constant(value=1),
simple=1,
)
result = parser.parse_ann_assign(stmt)
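The hunk above swaps the deprecated `ast.Num` node for `ast.Constant` when building the `AnnAssign` fixture. A standalone sketch of the same construction, round-tripped through `ast.unparse` (Python 3.9+) to confirm it really encodes an annotated assignment:

```python
import ast

# Build `x: int = 1` programmatically, mirroring the test fixture; the
# modern constructor keyword for ast.Constant is `value`, replacing the
# deprecated ast.Num(n=...) spelling.
stmt = ast.AnnAssign(
    target=ast.Name(id="x", ctx=ast.Store()),
    annotation=ast.Name(id="int", ctx=ast.Load()),
    value=ast.Constant(value=1),
    simple=1,
)

module = ast.Module(body=[stmt], type_ignores=[])
ast.fix_missing_locations(module)  # required before compiling hand-built nodes
source = ast.unparse(module)
```

Unparsing yields `x: int = 1`, so the hand-built tree is equivalent to what `ast.parse` would produce for that line.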
@@ -366,16 +364,6 @@ def test_component_get_code_tree_syntax_error():
component.get_code_tree(component.code)
def test_custom_component_class_template_validation_no_code():
"""
Test the _class_template_validation method of the CustomComponent class
raises the HTTPException when the code is None.
"""
custom_component = CustomComponent(code=None, function_entrypoint_name="build")
with pytest.raises(HTTPException):
custom_component._class_template_validation(custom_component.code)
def test_custom_component_get_code_tree_syntax_error():
"""
Test the get_code_tree method of the CustomComponent class
@@ -533,12 +521,12 @@ def test_build_config_field_value_keys(component):
def test_create_and_validate_component_valid_code(test_component_code):
component = create_and_validate_component(test_component_code)
component = CustomComponent(code=test_component_code)
assert isinstance(component, CustomComponent)
def test_build_langchain_template_custom_component_valid_code(test_component_code):
component = create_and_validate_component(test_component_code)
component = CustomComponent(code=test_component_code)
frontend_node = build_custom_component_template(component)
assert isinstance(frontend_node, dict)
template = frontend_node["template"]
@@ -552,7 +540,7 @@ def test_build_langchain_template_custom_component_valid_code(test_component_cod
def test_build_langchain_template_custom_component_templatefield(test_component_with_templatefield_code):
component = create_and_validate_component(test_component_with_templatefield_code)
component = CustomComponent(code=test_component_with_templatefield_code)
frontend_node = build_custom_component_template(component)
assert isinstance(frontend_node, dict)
template = frontend_node["template"]

View file

@@ -3,7 +3,7 @@ import json
import pytest
from langchain.chains.base import Chain
from langflow.graph import Graph
from langflow.processing.process import load_flow_from_json
from langflow.processing.load import load_flow_from_json
from langflow.utils.payload import get_root_vertex
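The final hunk moves `load_flow_from_json` from `langflow.processing.process` to `langflow.processing.load`. Code that must run against both versions can resolve the function from a list of candidate module paths; a hedged sketch (demonstrated on a stdlib module, since `langflow` is not assumed to be installed here):

```python
import importlib

def resolve_attr(attr: str, *module_paths: str):
    """Return the first matching attribute found among candidate modules."""
    for path in module_paths:
        try:
            return getattr(importlib.import_module(path), attr)
        except (ImportError, AttributeError):
            continue
    raise ImportError(f"{attr} not found in any of {module_paths}")

# With Langflow installed this would be (hypothetical usage):
#   load_flow_from_json = resolve_attr(
#       "load_flow_from_json",
#       "langflow.processing.load",     # new location in this PR
#       "langflow.processing.process",  # pre-refactor location
#   )
# Demonstrated on the stdlib so the sketch is runnable anywhere:
loads = resolve_attr("loads", "json.no_such_submodule", "json")
result = loads("[1, 2, 3]")
```

The first candidate fails to import, the second resolves, and the caller never hard-codes a single module path.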