+ You can get the HTML code for embedding the chat by clicking the Code button
+ in the Sidebar after building a flow.
+
+
+{" "}
+
+
+ Click the Chat Widget HTML tab to get the code to insert. Read
+ below to learn how to use it with HTML, React, and Angular.
+
+
+{" "}
+
+
Further down, we will explain each of these options.
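For orientation, embedding the widget in a plain HTML page usually amounts to loading a script and dropping a custom element into the page. The snippet below is a hypothetical sketch — the script URL, tag name, and attribute names are placeholders; always copy the exact code from the Chat Widget HTML tab:

```html
<!-- Hypothetical sketch: copy the real snippet from the Chat Widget HTML tab -->
<script src="https://cdn.example.com/langflow-chat.js"></script>

<langflow-chat
  flow_id="YOUR_FLOW_ID"
  host_url="http://localhost:7860"
></langflow-chat>
```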
@@ -34,9 +32,10 @@ import ReactPlayer from "react-player";
Flows can be exported and imported as JSON files.
-:::caution
+
Watch out for API keys being stored in local files.
-:::
+
+
---
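The caution above can be made concrete with a small sketch that redacts key-like fields before an exported flow JSON is shared. The field names matched here (`api_key`, `token`, `secret`) are assumptions for illustration — the actual field names vary by component:

```javascript
// Hypothetical sketch: blank out key-like fields before sharing an exported flow.
const redactSecrets = (value) => {
  if (Array.isArray(value)) return value.map((item) => redactSecrets(item));
  if (value && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value).map(([key, val]) =>
        // Field-name patterns are an assumption; adjust for your components.
        /api[_-]?key|token|secret/i.test(key) ? [key, ""] : [key, redactSecrets(val)]
      )
    );
  }
  return value;
};

const flow = { nodes: [{ data: { openai_api_key: "sk-123", model: "gpt-4" } }] };
console.log(JSON.stringify(redactSecrets(flow)));
```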
diff --git a/docs/docs/guidelines/prompt-customization.mdx b/docs/docs/guidelines/prompt-customization.mdx
index 8e2f409f9..efb5b3928 100644
--- a/docs/docs/guidelines/prompt-customization.mdx
+++ b/docs/docs/guidelines/prompt-customization.mdx
@@ -7,80 +7,62 @@ import ReactPlayer from "react-player";
The prompt template allows users to create prompts and define variables that provide control over instructing the model.
-
-
-
+{" "}
+
Variables can be used to define instructions, questions, context, inputs, or examples for the model and can be created with any chosen name in curly brackets, e.g., `{variable_name}`. They act as placeholders for parts of the text that can be easily modified.
-
-
-
+{" "}
+
Once inserted, these variables are immediately recognized as new fields in the prompt component. Here, you can define their values within the component itself or leave a field empty to be adjusted from the chat interface.
-
-
-
+{" "}
+
+
You can also use documents or output parsers as prompt variables. Plugging them into a prompt handle disables that input field and feeds it from the connected component.
-
-
-
-
+{" "}
+
With this, users can interact with documents, webpages, or any other type of content directly from the prompt, which allows for seamless integration of external resources with the language model.
-
-
If working with an interactive (chat-like) flow, remember to leave one of the input variables empty so it can act as the chat input.
-
-
-
-
+{" "}
+
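The `{variable_name}` placeholder mechanic described above can be illustrated with a minimal sketch. This is an illustration of the templating idea only, not Langflow's internal implementation:

```javascript
// Hypothetical sketch: fill {curly-bracket} variables in a prompt template.
const fillTemplate = (template, vars) =>
  template.replace(/\{(\w+)\}/g, (match, name) =>
    name in vars ? vars[name] : match // leave unknown variables intact (e.g. the chat input)
  );

const prompt = "You are a {role} that {behavior}. User says: {chat_input}";
console.log(
  fillTemplate(prompt, { role: "obedient assistant", behavior: "answers concisely" })
);
```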
diff --git a/docs/docs/guides/chatprompttemplate_guide.mdx b/docs/docs/guides/chatprompttemplate_guide.mdx
index 422bb6420..05a8f3333 100644
--- a/docs/docs/guides/chatprompttemplate_guide.mdx
+++ b/docs/docs/guides/chatprompttemplate_guide.mdx
@@ -39,8 +39,7 @@ In this guide, we will modify the "Basic Chat with Prompt and History" example,
5. Open the "Prompt" field on the SystemMessagePromptTemplate component.
-6. Enter the text: `You are a {role} that {behavior}.`
-
+6. Enter the text: _`You are a {role} that {behavior}.`_
7. Save your changes by clicking on "Check & Save".
8. Define the 'role' variable by typing "obedient assistant".
diff --git a/docs/docs/guides/loading_document.mdx b/docs/docs/guides/loading_document.mdx
index d760e9124..73fb85968 100644
--- a/docs/docs/guides/loading_document.mdx
+++ b/docs/docs/guides/loading_document.mdx
@@ -43,7 +43,7 @@ This guide takes you through the process of augmenting the "Basic Chat with Prom
8. Connect this loader to the `{context}` variable that we just added.
-9. In the "Web Page" field, enter "https://langflow.org/how-upload-examples".
+9. In the "Web Page" field, enter "https://docs.langflow.org/how-upload-examples".
10. Now, click on "ConversationBufferMemory".
diff --git a/docs/docs/index.mdx b/docs/docs/index.mdx
index 4ec4a300d..7be04549c 100644
--- a/docs/docs/index.mdx
+++ b/docs/docs/index.mdx
@@ -6,13 +6,11 @@ import ThemedImage from "@theme/ThemedImage";
import useBaseUrl from "@docusaurus/useBaseUrl";
import ZoomableImage from "/src/theme/ZoomableImage.js";
-
-
-
+{" "}
+
diff --git a/docs/docusaurus.config.js b/docs/docusaurus.config.js
index 3ac152b5b..617aec3d0 100644
--- a/docs/docusaurus.config.js
+++ b/docs/docusaurus.config.js
@@ -1,127 +1,145 @@
const lightCodeTheme = require("prism-react-renderer/themes/github");
+const { remarkCodeHike } = require("@code-hike/mdx");
// With JSDoc @type annotations, IDEs can provide config autocompletion
/** @type {import('@docusaurus/types').DocusaurusConfig} */
-(
- module.exports = {
- title: "Langflow Documentation",
- tagline: "Langflow is a GUI for LangChain, designed with react-flow",
- favicon: "img/favicon.ico",
- url: "https://logspace-ai.github.io",
- baseUrl: "/",
- onBrokenLinks: "throw",
- onBrokenMarkdownLinks: "warn",
- organizationName: "logspace-ai",
- projectName: "langflow",
- trailingSlash: false,
- customFields: {
- mendableAnonKey: process.env.MENDABLE_ANON_KEY,
- },
- i18n: {
- defaultLocale: "en",
- locales: ["en"],
- },
- presets: [
- [
- "@docusaurus/preset-classic",
- /** @type {import('@docusaurus/preset-classic').Options} */
- ({
- docs: {
- routeBasePath: "/",
- sidebarPath: require.resolve("./sidebars.js"),
- path: "docs",
- // sidebarPath: 'sidebars.js',
- },
- theme: {
- customCss: require.resolve("./src/css/custom.css"),
- },
- }),
- ],
- ],
- plugins: [
- ["docusaurus-node-polyfills", { excludeAliases: ["console"] }],
- "docusaurus-plugin-image-zoom",
- // ....
- async function myPlugin(context, options) {
- return {
- name: "docusaurus-tailwindcss",
- configurePostCss(postcssOptions) {
- // Appends TailwindCSS and AutoPrefixer.
- postcssOptions.plugins.push(require("tailwindcss"));
- postcssOptions.plugins.push(require("autoprefixer"));
- return postcssOptions;
- },
- };
- },
- ],
- themeConfig:
- /** @type {import('@docusaurus/preset-classic').ThemeConfig} */
+module.exports = {
+ title: "Langflow Documentation",
+ tagline: "Langflow is a GUI for LangChain, designed with react-flow",
+ favicon: "img/favicon.ico",
+ url: "https://logspace-ai.github.io",
+ baseUrl: "/",
+ onBrokenLinks: "throw",
+ onBrokenMarkdownLinks: "warn",
+ organizationName: "logspace-ai",
+ projectName: "langflow",
+ trailingSlash: false,
+ customFields: {
+ mendableAnonKey: process.env.MENDABLE_ANON_KEY,
+ },
+ i18n: {
+ defaultLocale: "en",
+ locales: ["en"],
+ },
+ presets: [
+ [
+ "@docusaurus/preset-classic",
+ /** @type {import('@docusaurus/preset-classic').Options} */
({
- navbar: {
- hideOnScroll: true,
- title: "Langflow",
- logo: {
- alt: "Langflow",
- src: "img/chain.png",
- },
- items: [
- // right
- {
- position: "right",
- href: "https://github.com/logspace-ai/langflow",
- position: "right",
- className: "header-github-link",
- target: "_blank",
- rel: null,
- },
- {
- position: "right",
- href: "https://twitter.com/logspace_ai",
- position: "right",
- className: "header-twitter-link",
- target: "_blank",
- rel: null,
- },
- {
- position: "right",
- href: "https://discord.gg/EqksyE2EX9",
- position: "right",
- className: "header-discord-link",
- target: "_blank",
- rel: null,
- },
+ docs: {
+ beforeDefaultRemarkPlugins: [
+ [
+ remarkCodeHike,
+ {
+ theme: "github-light",
+ showCopyButton: true,
+ lineNumbers: true,
+ },
+ ],
+ ],
+ routeBasePath: "/",
+ sidebarPath: require.resolve("./sidebars.js"),
+ path: "docs",
+ // sidebarPath: 'sidebars.js',
+ },
+ theme: {
+ customCss: [
+ require.resolve("@code-hike/mdx/styles.css"),
+ require.resolve("./src/css/custom.css"),
],
},
- tableOfContents: {
- minHeadingLevel: 2,
- maxHeadingLevel: 5,
- },
- colorMode: {
- defaultMode: "light",
- disableSwitch: true,
- respectPrefersColorScheme: false,
- },
- announcementBar: {
- content:
-          '⭐️ If you like ⛓️Langflow, star it on GitHub! ⭐️',
- backgroundColor: "#B53D38", //Mustard Yellow #D19900 #D4B20B - Salmon #E9967A
- textColor: "#fff",
- isCloseable: false,
- },
- footer: {
- links: [],
-        copyright: `Copyright © ${new Date().getFullYear()} Logspace.`,
- },
- zoom: {
- selector: ".markdown :not(a) > img:not(.no-zoom)",
- background: {
- light: "rgba(240, 240, 240, 0.9)",
- },
- config: {},
- },
- prism: {
- theme: lightCodeTheme,
- },
}),
- }
-);
+ ],
+ ],
+ plugins: [
+ ["docusaurus-node-polyfills", { excludeAliases: ["console"] }],
+ "docusaurus-plugin-image-zoom",
+ // ....
+ async function myPlugin(context, options) {
+ return {
+ name: "docusaurus-tailwindcss",
+ configurePostCss(postcssOptions) {
+ // Appends TailwindCSS and AutoPrefixer.
+ postcssOptions.plugins.push(require("tailwindcss"));
+ postcssOptions.plugins.push(require("autoprefixer"));
+ return postcssOptions;
+ },
+ };
+ },
+ ],
+ themes: ["mdx-v2"],
+ themeConfig:
+ /** @type {import('@docusaurus/preset-classic').ThemeConfig} */
+ ({
+ navbar: {
+ hideOnScroll: true,
+ title: "Langflow",
+ logo: {
+ alt: "Langflow",
+ src: "img/chain.png",
+ },
+ items: [
+ // right
+ {
+ position: "right",
+ href: "https://github.com/logspace-ai/langflow",
+ className: "header-github-link",
+ target: "_blank",
+ rel: null,
+ },
+ {
+ position: "right",
+ href: "https://twitter.com/logspace_ai",
+ position: "right",
+ className: "header-twitter-link",
+ target: "_blank",
+ rel: null,
+ },
+ {
+ position: "right",
+ href: "https://discord.gg/EqksyE2EX9",
+ className: "header-discord-link",
+ target: "_blank",
+ rel: null,
+ },
+ ],
+ },
+ tableOfContents: {
+ minHeadingLevel: 2,
+ maxHeadingLevel: 5,
+ },
+ colorMode: {
+ defaultMode: "light",
+ disableSwitch: true,
+ respectPrefersColorScheme: false,
+ },
+ announcementBar: {
+ content:
+          '⭐️ If you like ⛓️Langflow, star it on GitHub! ⭐️',
+ backgroundColor: "#E8EBF1", //Mustard Yellow #D19900 #D4B20B - Salmon #E9967A
+ textColor: "#1C1E21",
+ isCloseable: false,
+ },
+ footer: {
+ links: [],
+      copyright: `Copyright © ${new Date().getFullYear()} Logspace.`,
+ },
+ zoom: {
+ selector: ".markdown :not(a) > img:not(.no-zoom)",
+ background: {
+ light: "rgba(240, 240, 240, 0.9)",
+ },
+ config: {},
+ },
+ // prism: {
+ // theme: require("prism-react-renderer/themes/dracula"),
+ // },
+ docs: {
+ sidebar: {
+ hideable: true,
+ },
+ },
+ }),
+};
diff --git a/docs/package-lock.json b/docs/package-lock.json
index 7db7f9376..ed79230c6 100644
--- a/docs/package-lock.json
+++ b/docs/package-lock.json
@@ -9,12 +9,13 @@
"version": "0.0.0",
"dependencies": {
"@babel/preset-react": "^7.22.3",
+ "@code-hike/mdx": "^0.9.0",
"@docusaurus/core": "2.4.1",
"@docusaurus/plugin-ideal-image": "^2.4.1",
"@docusaurus/preset-classic": "2.4.1",
"@docusaurus/theme-classic": "^2.4.1",
"@docusaurus/theme-search-algolia": "^2.4.1",
- "@mdx-js/react": "^1.6.22",
+ "@mdx-js/react": "^2.3.0",
"@mendable/search": "^0.0.114",
"@pbe/react-yandex-maps": "^1.2.4",
"@prismicio/client": "^7.0.1",
@@ -22,6 +23,7 @@
"autoprefixer": "^10.4.14",
"clsx": "^1.2.1",
"docusaurus-plugin-image-zoom": "^0.1.4",
+ "docusaurus-theme-mdx-v2": "^0.1.2",
"jquery": "^3.7.0",
"medium-zoom": "^1.0.8",
"node-fetch": "^3.3.1",
@@ -1986,6 +1988,49 @@
"node": ">=6.9.0"
}
},
+ "node_modules/@code-hike/lighter": {
+ "version": "0.7.0",
+ "resolved": "https://registry.npmjs.org/@code-hike/lighter/-/lighter-0.7.0.tgz",
+ "integrity": "sha512-64O07rIORKQLB+5T/GKAmKcD9sC0N9yHFJXa0Hs+0Aee1G+I4bSXxTccuDFP6c/G/3h5Pk7yv7PoX9/SpzaeiQ==",
+ "funding": {
+ "url": "https://github.com/code-hike/lighter?sponsor=1"
+ }
+ },
+ "node_modules/@code-hike/mdx": {
+ "version": "0.9.0",
+ "resolved": "https://registry.npmjs.org/@code-hike/mdx/-/mdx-0.9.0.tgz",
+ "integrity": "sha512-0wg68ZCjVWAkWT4gBUZJ8Mwktjen/XeWyqBQCrhA2IZSbZZnMYsEI6JJEFb/nZoNI3comB3JdxPLykZRq3qT2A==",
+ "dependencies": {
+ "@code-hike/lighter": "0.7.0",
+ "node-fetch": "^2.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/code-hike"
+ },
+ "peerDependencies": {
+ "react": "^16.8.3 || ^17 || ^18"
+ }
+ },
+ "node_modules/@code-hike/mdx/node_modules/node-fetch": {
+ "version": "2.6.12",
+ "resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.12.tgz",
+ "integrity": "sha512-C/fGU2E8ToujUivIO0H+tpQ6HWo4eEmchoPIoXtxCrVghxdKq+QOHqEZW7tuP3KlV3bC8FRMO5nMCC7Zm1VP6g==",
+ "dependencies": {
+ "whatwg-url": "^5.0.0"
+ },
+ "engines": {
+ "node": "4.x || >=6.0.0"
+ },
+ "peerDependencies": {
+ "encoding": "^0.1.0"
+ },
+ "peerDependenciesMeta": {
+ "encoding": {
+ "optional": true
+ }
+ }
+ },
"node_modules/@colors/colors": {
"version": "1.5.0",
"resolved": "https://registry.npmjs.org/@colors/colors/-/colors-1.5.0.tgz",
@@ -2683,6 +2728,18 @@
"react-dom": "^16.8.4 || ^17.0.0"
}
},
+ "node_modules/@docusaurus/theme-classic/node_modules/@mdx-js/react": {
+ "version": "1.6.22",
+ "resolved": "https://registry.npmjs.org/@mdx-js/react/-/react-1.6.22.tgz",
+ "integrity": "sha512-TDoPum4SHdfPiGSAaRBw7ECyI8VaHpK8GJugbJIJuqyh6kzw9ZLJZW3HGL3NNrJGxcAixUvqROm+YuQOo5eXtg==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ },
+ "peerDependencies": {
+ "react": "^16.13.1 || ^17.0.0"
+ }
+ },
"node_modules/@docusaurus/theme-common": {
"version": "2.4.1",
"resolved": "https://registry.npmjs.org/@docusaurus/theme-common/-/theme-common-2.4.1.tgz",
@@ -3168,15 +3225,19 @@
}
},
"node_modules/@mdx-js/react": {
- "version": "1.6.22",
- "resolved": "https://registry.npmjs.org/@mdx-js/react/-/react-1.6.22.tgz",
- "integrity": "sha512-TDoPum4SHdfPiGSAaRBw7ECyI8VaHpK8GJugbJIJuqyh6kzw9ZLJZW3HGL3NNrJGxcAixUvqROm+YuQOo5eXtg==",
+ "version": "2.3.0",
+ "resolved": "https://registry.npmjs.org/@mdx-js/react/-/react-2.3.0.tgz",
+ "integrity": "sha512-zQH//gdOmuu7nt2oJR29vFhDv88oGPmVw6BggmrHeMI+xgEkp1B2dX9/bMBSYtK0dyLX/aOmesKS09g222K1/g==",
+ "dependencies": {
+ "@types/mdx": "^2.0.0",
+ "@types/react": ">=16"
+ },
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/unified"
},
"peerDependencies": {
- "react": "^16.13.1 || ^17.0.0"
+ "react": ">=16"
}
},
"node_modules/@mdx-js/util": {
@@ -3665,6 +3726,14 @@
"node": ">=10.13.0"
}
},
+ "node_modules/@types/acorn": {
+ "version": "4.0.6",
+ "resolved": "https://registry.npmjs.org/@types/acorn/-/acorn-4.0.6.tgz",
+ "integrity": "sha512-veQTnWP+1D/xbxVrPC3zHnCZRjSrKfhbMUlEA43iMZLu7EsnTtkJklIuwrCPbOi8YkvDQAiW05VQQFvvz9oieQ==",
+ "dependencies": {
+ "@types/estree": "*"
+ }
+ },
"node_modules/@types/body-parser": {
"version": "1.19.2",
"resolved": "https://registry.npmjs.org/@types/body-parser/-/body-parser-1.19.2.tgz",
@@ -3730,6 +3799,14 @@
"resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.1.tgz",
"integrity": "sha512-LG4opVs2ANWZ1TJoKc937iMmNstM/d0ae1vNbnBvBhqCSezgVUOzcLCqbI5elV8Vy6WKwKjaqR+zO9VKirBBCA=="
},
+ "node_modules/@types/estree-jsx": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/@types/estree-jsx/-/estree-jsx-1.0.0.tgz",
+ "integrity": "sha512-3qvGd0z8F2ENTGr/GG1yViqfiKmRfrXVx5sJyHGFu3z7m5g5utCQtGp/g29JnjflhtQJBv1WDQukHiT58xPcYQ==",
+ "dependencies": {
+ "@types/estree": "*"
+ }
+ },
"node_modules/@types/express": {
"version": "4.17.17",
"resolved": "https://registry.npmjs.org/@types/express/-/express-4.17.17.tgz",
@@ -3817,6 +3894,11 @@
"@types/unist": "^2"
}
},
+ "node_modules/@types/mdx": {
+ "version": "2.0.5",
+ "resolved": "https://registry.npmjs.org/@types/mdx/-/mdx-2.0.5.tgz",
+ "integrity": "sha512-76CqzuD6Q7LC+AtbPqrvD9AqsN0k8bsYo2bM2J8pmNldP1aIPAbzUQ7QbobyXL4eLr1wK5x8FZFe8eF/ubRuBg=="
+ },
"node_modules/@types/mime": {
"version": "1.3.2",
"resolved": "https://registry.npmjs.org/@types/mime/-/mime-1.3.2.tgz",
@@ -4198,6 +4280,14 @@
"acorn": "^8"
}
},
+ "node_modules/acorn-jsx": {
+ "version": "5.3.2",
+ "resolved": "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-5.3.2.tgz",
+ "integrity": "sha512-rq9s+JNhf0IChjtDXxllJ7g41oZk5SlXtp0LHwyA5cejwn7vKmKp4pPri6YEePv2PU65sAsegbXtIinmDFDXgQ==",
+ "peerDependencies": {
+ "acorn": "^6.0.0 || ^7.0.0 || ^8.0.0"
+ }
+ },
"node_modules/acorn-walk": {
"version": "8.2.0",
"resolved": "https://registry.npmjs.org/acorn-walk/-/acorn-walk-8.2.0.tgz",
@@ -4502,6 +4592,14 @@
"util": "^0.12.0"
}
},
+ "node_modules/astring": {
+ "version": "1.8.6",
+ "resolved": "https://registry.npmjs.org/astring/-/astring-1.8.6.tgz",
+ "integrity": "sha512-ISvCdHdlTDlH5IpxQJIex7BWBywFWgjJSVdwst+/iQCoEYnyOaQ95+X1JGshuBjGp6nxKUy1jMgE3zPqN7fQdg==",
+ "bin": {
+ "astring": "bin/astring"
+ }
+ },
"node_modules/async-foreach": {
"version": "0.1.3",
"resolved": "https://registry.npmjs.org/async-foreach/-/async-foreach-0.1.3.tgz",
@@ -5391,6 +5489,15 @@
"url": "https://github.com/sponsors/wooorm"
}
},
+ "node_modules/character-entities-html4": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/character-entities-html4/-/character-entities-html4-2.1.0.tgz",
+ "integrity": "sha512-1v7fgQRj6hnSwFpq1Eu0ynr/CDEw0rXo2B61qXrLNdHZmPKgb7fqS1a2JwF0rISo9q77jDI8VMEHoApn8qDoZA==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
"node_modules/character-entities-legacy": {
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/character-entities-legacy/-/character-entities-legacy-1.1.4.tgz",
@@ -6884,6 +6991,296 @@
"node": ">=6"
}
},
+ "node_modules/docusaurus-mdx-loader-v2": {
+ "version": "0.1.2",
+ "resolved": "https://registry.npmjs.org/docusaurus-mdx-loader-v2/-/docusaurus-mdx-loader-v2-0.1.2.tgz",
+ "integrity": "sha512-Dd/XieCKKoirnJDou4h33zRZPCmbtSqvXrZm0yMmhCpLDpeScu8CBvveFVHCqs7UB+x82IpzgZX5rHkoFlz2Bw==",
+ "dependencies": {
+ "@babel/parser": "^7.17.3",
+ "@babel/traverse": "^7.17.3",
+ "@docusaurus/logger": "2.0.0-beta.18",
+ "@docusaurus/utils": "2.0.0-beta.18",
+ "@mdx-js/mdx": "^2.1.0",
+ "escape-html": "^1.0.3",
+ "estree-util-value-to-estree": "^1.3.0",
+ "file-loader": "^6.2.0",
+ "fs-extra": "^10.0.1",
+ "image-size": "^1.0.1",
+ "lz-string": "^1.4.4",
+ "mdast-util-to-string": "^2.0.0",
+ "remark-admonitions": "^1.2.1",
+ "remark-emoji": "^2.1.0",
+ "remark-gfm": "1.0.0",
+ "stringify-object": "^3.3.0",
+ "tslib": "^2.3.1",
+ "unist-util-visit": "^2.0.2",
+ "url-loader": "^4.1.1",
+ "webpack": "^5.69.1"
+ },
+ "engines": {
+ "node": ">=14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/@docusaurus/logger": {
+ "version": "2.0.0-beta.18",
+ "resolved": "https://registry.npmjs.org/@docusaurus/logger/-/logger-2.0.0-beta.18.tgz",
+ "integrity": "sha512-frNe5vhH3mbPmH980Lvzaz45+n1PQl3TkslzWYXQeJOkFX17zUd3e3U7F9kR1+DocmAqHkgAoWuXVcvEoN29fg==",
+ "dependencies": {
+ "chalk": "^4.1.2",
+ "tslib": "^2.3.1"
+ },
+ "engines": {
+ "node": ">=14"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/@docusaurus/utils": {
+ "version": "2.0.0-beta.18",
+ "resolved": "https://registry.npmjs.org/@docusaurus/utils/-/utils-2.0.0-beta.18.tgz",
+ "integrity": "sha512-v2vBmH7xSbPwx3+GB90HgLSQdj+Rh5ELtZWy7M20w907k0ROzDmPQ/8Ke2DK3o5r4pZPGnCrsB3SaYI83AEmAA==",
+ "dependencies": {
+ "@docusaurus/logger": "2.0.0-beta.18",
+ "@svgr/webpack": "^6.2.1",
+ "file-loader": "^6.2.0",
+ "fs-extra": "^10.0.1",
+ "github-slugger": "^1.4.0",
+ "globby": "^11.1.0",
+ "gray-matter": "^4.0.3",
+ "js-yaml": "^4.1.0",
+ "lodash": "^4.17.21",
+ "micromatch": "^4.0.5",
+ "resolve-pathname": "^3.0.0",
+ "shelljs": "^0.8.5",
+ "tslib": "^2.3.1",
+ "url-loader": "^4.1.1",
+ "webpack": "^5.70.0"
+ },
+ "engines": {
+ "node": ">=14"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/@mdx-js/mdx": {
+ "version": "2.3.0",
+ "resolved": "https://registry.npmjs.org/@mdx-js/mdx/-/mdx-2.3.0.tgz",
+ "integrity": "sha512-jLuwRlz8DQfQNiUCJR50Y09CGPq3fLtmtUQfVrj79E0JWu3dvsVcxVIcfhR5h0iXu+/z++zDrYeiJqifRynJkA==",
+ "dependencies": {
+ "@types/estree-jsx": "^1.0.0",
+ "@types/mdx": "^2.0.0",
+ "estree-util-build-jsx": "^2.0.0",
+ "estree-util-is-identifier-name": "^2.0.0",
+ "estree-util-to-js": "^1.1.0",
+ "estree-walker": "^3.0.0",
+ "hast-util-to-estree": "^2.0.0",
+ "markdown-extensions": "^1.0.0",
+ "periscopic": "^3.0.0",
+ "remark-mdx": "^2.0.0",
+ "remark-parse": "^10.0.0",
+ "remark-rehype": "^10.0.0",
+ "unified": "^10.0.0",
+ "unist-util-position-from-estree": "^1.0.0",
+ "unist-util-stringify-position": "^3.0.0",
+ "unist-util-visit": "^4.0.0",
+ "vfile": "^5.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/@mdx-js/mdx/node_modules/unist-util-visit": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmjs.org/unist-util-visit/-/unist-util-visit-4.1.2.tgz",
+ "integrity": "sha512-MSd8OUGISqHdVvfY9TPhyK2VdUrPgxkUtWSuMHF6XAAFuL4LokseigBnZtPnJMu+FbynTkFNnFlyjxpVKujMRg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0",
+ "unist-util-visit-parents": "^5.1.1"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/ansi-styles": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
+ "integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
+ "dependencies": {
+ "color-convert": "^2.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/ansi-styles?sponsor=1"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/bail": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/bail/-/bail-2.0.2.tgz",
+ "integrity": "sha512-0xO6mYd7JB2YesxDKplafRpsiOzPt9V02ddPCLbY1xYGPOX24NTyN50qnUxgCPcSoYMhKpAuBTjQoRZCAkUDRw==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/chalk": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",
+ "integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==",
+ "dependencies": {
+ "ansi-styles": "^4.1.0",
+ "supports-color": "^7.1.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/chalk?sponsor=1"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/color-convert": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
+ "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
+ "dependencies": {
+ "color-name": "~1.1.4"
+ },
+ "engines": {
+ "node": ">=7.0.0"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/color-name": {
+ "version": "1.1.4",
+ "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
+ "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA=="
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/has-flag": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz",
+ "integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/is-plain-obj": {
+ "version": "4.1.0",
+ "resolved": "https://registry.npmjs.org/is-plain-obj/-/is-plain-obj-4.1.0.tgz",
+ "integrity": "sha512-+Pgi+vMuUNkJyExiMBt5IlFoMyKnr5zhJ4Uspz58WOhBF5QoIZkFyNHIbBAtHwzVAgk5RtndVNsDRN61/mmDqg==",
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/remark-mdx": {
+ "version": "2.3.0",
+ "resolved": "https://registry.npmjs.org/remark-mdx/-/remark-mdx-2.3.0.tgz",
+ "integrity": "sha512-g53hMkpM0I98MU266IzDFMrTD980gNF3BJnkyFcmN+dD873mQeD5rdMO3Y2X+x8umQfbSE0PcoEDl7ledSA+2g==",
+ "dependencies": {
+ "mdast-util-mdx": "^2.0.0",
+ "micromark-extension-mdxjs": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/supports-color": {
+ "version": "7.2.0",
+ "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz",
+ "integrity": "sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==",
+ "dependencies": {
+ "has-flag": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/trough": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/trough/-/trough-2.1.0.tgz",
+ "integrity": "sha512-AqTiAOLcj85xS7vQ8QkAV41hPDIJ71XJB4RCUrzo/1GM2CQwhkJGaf9Hgr7BOugMRpgGUrqRg/DrBDl4H40+8g==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/unified": {
+ "version": "10.1.2",
+ "resolved": "https://registry.npmjs.org/unified/-/unified-10.1.2.tgz",
+ "integrity": "sha512-pUSWAi/RAnVy1Pif2kAoeWNBa3JVrx0MId2LASj8G+7AiHWoKZNTomq6LG326T68U7/e263X6fTdcXIy7XnF7Q==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "bail": "^2.0.0",
+ "extend": "^3.0.0",
+ "is-buffer": "^2.0.0",
+ "is-plain-obj": "^4.0.0",
+ "trough": "^2.0.0",
+ "vfile": "^5.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/unist-util-is": {
+ "version": "5.2.1",
+ "resolved": "https://registry.npmjs.org/unist-util-is/-/unist-util-is-5.2.1.tgz",
+ "integrity": "sha512-u9njyyfEh43npf1M+yGKDGVPbY/JWEemg5nH05ncKPfi+kBbKBJoTdsogMu33uhytuLlv9y0O7GH7fEdwLdLQw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/unist-util-visit-parents": {
+ "version": "5.1.3",
+ "resolved": "https://registry.npmjs.org/unist-util-visit-parents/-/unist-util-visit-parents-5.1.3.tgz",
+ "integrity": "sha512-x6+y8g7wWMyQhL1iZfhIPhDAs7Xwbn9nRosDXl7qoPTSCy0yNxnKc+hWokFifWQIDGi154rdUqKvbCa4+1kLhg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/vfile": {
+ "version": "5.3.7",
+ "resolved": "https://registry.npmjs.org/vfile/-/vfile-5.3.7.tgz",
+ "integrity": "sha512-r7qlzkgErKjobAmyNIkkSpizsFPYiUPuJb5pNW1RB4JcYVZhs4lIbVqk8XPk033CV/1z8ss5pkax8SuhGpcG8g==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "is-buffer": "^2.0.0",
+ "unist-util-stringify-position": "^3.0.0",
+ "vfile-message": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/docusaurus-mdx-loader-v2/node_modules/vfile-message": {
+ "version": "3.1.4",
+ "resolved": "https://registry.npmjs.org/vfile-message/-/vfile-message-3.1.4.tgz",
+ "integrity": "sha512-fa0Z6P8HUrQN4BZaX05SIVXic+7kE3b05PWAtPuYP9QLHsLKYR7/AlLW3NtOrpXRLeawpDLMsVkmk5DG0NXgWw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-stringify-position": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/docusaurus-node-polyfills": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/docusaurus-node-polyfills/-/docusaurus-node-polyfills-1.0.0.tgz",
@@ -6906,6 +7303,18 @@
"medium-zoom": "^1.0.6"
}
},
+ "node_modules/docusaurus-theme-mdx-v2": {
+ "version": "0.1.2",
+ "resolved": "https://registry.npmjs.org/docusaurus-theme-mdx-v2/-/docusaurus-theme-mdx-v2-0.1.2.tgz",
+ "integrity": "sha512-n5L4nx0LV5coTkZYS+owXmM0ACXWCbd4ou7aDrWIMm3YH7XPusSNelJpYsUKJxHFER/+czitbmieboFe4I7lMQ==",
+ "dependencies": {
+ "@mdx-js/react": "^2.1.0",
+ "docusaurus-mdx-loader-v2": "0.1.2"
+ },
+ "engines": {
+ "node": ">=14"
+ }
+ },
"node_modules/dom-converter": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/dom-converter/-/dom-converter-0.2.0.tgz",
@@ -7247,6 +7656,106 @@
"node": ">=4.0"
}
},
+ "node_modules/estree-util-attach-comments": {
+ "version": "2.1.1",
+ "resolved": "https://registry.npmjs.org/estree-util-attach-comments/-/estree-util-attach-comments-2.1.1.tgz",
+ "integrity": "sha512-+5Ba/xGGS6mnwFbXIuQiDPTbuTxuMCooq3arVv7gPZtYpjp+VXH/NkHAP35OOefPhNG/UGqU3vt/LTABwcHX0w==",
+ "dependencies": {
+ "@types/estree": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/estree-util-build-jsx": {
+ "version": "2.2.2",
+ "resolved": "https://registry.npmjs.org/estree-util-build-jsx/-/estree-util-build-jsx-2.2.2.tgz",
+ "integrity": "sha512-m56vOXcOBuaF+Igpb9OPAy7f9w9OIkb5yhjsZuaPm7HoGi4oTOQi0h2+yZ+AtKklYFZ+rPC4n0wYCJCEU1ONqg==",
+ "dependencies": {
+ "@types/estree-jsx": "^1.0.0",
+ "estree-util-is-identifier-name": "^2.0.0",
+ "estree-walker": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/estree-util-is-identifier-name": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/estree-util-is-identifier-name/-/estree-util-is-identifier-name-2.1.0.tgz",
+ "integrity": "sha512-bEN9VHRyXAUOjkKVQVvArFym08BTWB0aJPppZZr0UNyAqWsLaVfAqP7hbaTJjzHifmB5ebnR8Wm7r7yGN/HonQ==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/estree-util-to-js": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/estree-util-to-js/-/estree-util-to-js-1.2.0.tgz",
+ "integrity": "sha512-IzU74r1PK5IMMGZXUVZbmiu4A1uhiPgW5hm1GjcOfr4ZzHaMPpLNJjR7HjXiIOzi25nZDrgFTobHTkV5Q6ITjA==",
+ "dependencies": {
+ "@types/estree-jsx": "^1.0.0",
+ "astring": "^1.8.0",
+ "source-map": "^0.7.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/estree-util-to-js/node_modules/source-map": {
+ "version": "0.7.4",
+ "resolved": "https://registry.npmjs.org/source-map/-/source-map-0.7.4.tgz",
+ "integrity": "sha512-l3BikUxvPOcn5E74dZiq5BGsTb5yEwhaTSzccU6t4sDOH8NWJCstKO5QT2CvtFoK6F0saL7p9xHAqHOlCPJygA==",
+ "engines": {
+ "node": ">= 8"
+ }
+ },
+ "node_modules/estree-util-value-to-estree": {
+ "version": "1.3.0",
+ "resolved": "https://registry.npmjs.org/estree-util-value-to-estree/-/estree-util-value-to-estree-1.3.0.tgz",
+ "integrity": "sha512-Y+ughcF9jSUJvncXwqRageavjrNPAI+1M/L3BI3PyLp1nmgYTGUXU6t5z1Y7OWuThoDdhPME07bQU+d5LxdJqw==",
+ "dependencies": {
+ "is-plain-obj": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=12.0.0"
+ }
+ },
+ "node_modules/estree-util-value-to-estree/node_modules/is-plain-obj": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/is-plain-obj/-/is-plain-obj-3.0.0.tgz",
+ "integrity": "sha512-gwsOE28k+23GP1B6vFl1oVh/WOzmawBrKwo5Ev6wMKzPkaXaCDIQKzLnvsA42DRlbVTWorkgTKIviAKCWkfUwA==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/estree-util-visit": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmjs.org/estree-util-visit/-/estree-util-visit-1.2.1.tgz",
+ "integrity": "sha512-xbgqcrkIVbIG+lI/gzbvd9SGTJL4zqJKBFttUl5pP27KhAjtMKbX/mQXJ7qgyXpMgVy/zvpm0xoQQaGL8OloOw==",
+ "dependencies": {
+ "@types/estree-jsx": "^1.0.0",
+ "@types/unist": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/estree-walker": {
+ "version": "3.0.3",
+ "resolved": "https://registry.npmjs.org/estree-walker/-/estree-walker-3.0.3.tgz",
+ "integrity": "sha512-7RUKfXgSMMkzt6ZuXmqapOurLGPPfgj6l9uRZ7lRGolvk0y2yocc35LdcxKC5PQZdn2DMqioAQ2NoWcrTKmm6g==",
+ "dependencies": {
+ "@types/estree": "^1.0.0"
+ }
+ },
"node_modules/esutils": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz",
@@ -8606,6 +9115,88 @@
"resolved": "https://registry.npmjs.org/parse5/-/parse5-6.0.1.tgz",
"integrity": "sha512-Ofn/CTFzRGTTxwpNEs9PP93gXShHcTq255nzRYSKe8AkVpZY7e1fpmTfOyoIvjP5HG7Z2ZM7VS9PPhQGW2pOpw=="
},
+ "node_modules/hast-util-to-estree": {
+ "version": "2.3.3",
+ "resolved": "https://registry.npmjs.org/hast-util-to-estree/-/hast-util-to-estree-2.3.3.tgz",
+ "integrity": "sha512-ihhPIUPxN0v0w6M5+IiAZZrn0LH2uZomeWwhn7uP7avZC6TE7lIiEh2yBMPr5+zi1aUCXq6VoYRgs2Bw9xmycQ==",
+ "dependencies": {
+ "@types/estree": "^1.0.0",
+ "@types/estree-jsx": "^1.0.0",
+ "@types/hast": "^2.0.0",
+ "@types/unist": "^2.0.0",
+ "comma-separated-tokens": "^2.0.0",
+ "estree-util-attach-comments": "^2.0.0",
+ "estree-util-is-identifier-name": "^2.0.0",
+ "hast-util-whitespace": "^2.0.0",
+ "mdast-util-mdx-expression": "^1.0.0",
+ "mdast-util-mdxjs-esm": "^1.0.0",
+ "property-information": "^6.0.0",
+ "space-separated-tokens": "^2.0.0",
+ "style-to-object": "^0.4.1",
+ "unist-util-position": "^4.0.0",
+ "zwitch": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/hast-util-to-estree/node_modules/comma-separated-tokens": {
+ "version": "2.0.3",
+ "resolved": "https://registry.npmjs.org/comma-separated-tokens/-/comma-separated-tokens-2.0.3.tgz",
+ "integrity": "sha512-Fu4hJdvzeylCfQPp9SGWidpzrMs7tTrlu6Vb8XGaRGck8QSNZJJp538Wrb60Lax4fPwR64ViY468OIUTbRlGZg==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/hast-util-to-estree/node_modules/property-information": {
+ "version": "6.2.0",
+ "resolved": "https://registry.npmjs.org/property-information/-/property-information-6.2.0.tgz",
+ "integrity": "sha512-kma4U7AFCTwpqq5twzC1YVIDXSqg6qQK6JN0smOw8fgRy1OkMi0CYSzFmsy6dnqSenamAtj0CyXMUJ1Mf6oROg==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/hast-util-to-estree/node_modules/space-separated-tokens": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/space-separated-tokens/-/space-separated-tokens-2.0.2.tgz",
+ "integrity": "sha512-PEGlAwrG8yXGXRjW32fGbg66JAlOAwbObuqVoJpv/mRgoWDQfgH1wDPvtzWyUSNAXBGSk8h755YDbbcEy3SH2Q==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/hast-util-to-estree/node_modules/style-to-object": {
+ "version": "0.4.1",
+ "resolved": "https://registry.npmjs.org/style-to-object/-/style-to-object-0.4.1.tgz",
+ "integrity": "sha512-HFpbb5gr2ypci7Qw+IOhnP2zOU7e77b+rzM+wTzXzfi1PrtBCX0E7Pk4wL4iTLnhzZ+JgEGAhX81ebTg/aYjQw==",
+ "dependencies": {
+ "inline-style-parser": "0.1.1"
+ }
+ },
+ "node_modules/hast-util-to-estree/node_modules/unist-util-position": {
+ "version": "4.0.4",
+ "resolved": "https://registry.npmjs.org/unist-util-position/-/unist-util-position-4.0.4.tgz",
+ "integrity": "sha512-kUBE91efOWfIVBo8xzh/uZQ7p9ffYRtUbMRZBNFYwf0RK8koUMx6dGUfwylLOKmaT2cs4wSW96QoYUSXAyEtpg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/hast-util-to-estree/node_modules/zwitch": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/zwitch/-/zwitch-2.0.4.tgz",
+ "integrity": "sha512-bXE4cR/kVZhKZX/RjPEflHaKVhUVl85noU3v6b8apfQEc1x4A+zBxjZ4lN8LqGd6WZ3dl98pY4o717VFmoPp+A==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
"node_modules/hast-util-to-parse5": {
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/hast-util-to-parse5/-/hast-util-to-parse5-6.0.0.tgz",
@@ -8622,6 +9213,15 @@
"url": "https://opencollective.com/unified"
}
},
+ "node_modules/hast-util-whitespace": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/hast-util-whitespace/-/hast-util-whitespace-2.0.1.tgz",
+ "integrity": "sha512-nAxA0v8+vXSBDt3AnRUNjyRIQ0rD+ntpbAp4LnPkumc5M9yUbSMa4XDU9Q6etY4f1Wp4bNgvc1yjiZtsTTrSng==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/hastscript": {
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/hastscript/-/hastscript-6.0.0.tgz",
@@ -9472,6 +10072,14 @@
"node": ">=0.10.0"
}
},
+ "node_modules/is-reference": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/is-reference/-/is-reference-3.0.1.tgz",
+ "integrity": "sha512-baJJdQLiYaJdvFbJqXrcGv3WU3QCzBlUcI5QhbesIm6/xPsvmO+2CDoi/GMOFBQEQm+PXkwOPrp9KK5ozZsp2w==",
+ "dependencies": {
+ "@types/estree": "*"
+ }
+ },
"node_modules/is-regexp": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/is-regexp/-/is-regexp-1.0.0.tgz",
@@ -9952,6 +10560,15 @@
"resolved": "https://registry.npmjs.org/lodash.uniq/-/lodash.uniq-4.5.0.tgz",
"integrity": "sha512-xfBaXQd9ryd9dlSDvnvI0lvxfLJlYAZzXomUYzLKtUeOQvOP5piqAWuGtrhWeqaXK9hhoM/iyJc5AV+XfsX3HQ=="
},
+ "node_modules/longest-streak": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/longest-streak/-/longest-streak-2.0.4.tgz",
+ "integrity": "sha512-vM6rUVCVUJJt33bnmHiZEvr7wPT78ztX7rojL+LW51bHtLh6HTjx84LA5W4+oa6aKEJA7jJu5LR6vQRBpA5DVg==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
"node_modules/loose-envify": {
"version": "1.4.0",
"resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.4.0.tgz",
@@ -9987,6 +10604,14 @@
"yallist": "^3.0.2"
}
},
+ "node_modules/lz-string": {
+ "version": "1.5.0",
+ "resolved": "https://registry.npmjs.org/lz-string/-/lz-string-1.5.0.tgz",
+ "integrity": "sha512-h5bgJWpxJNswbU7qCrV0tIKQCaS3blPDrqKWx+QxzuzL1zGUzij9XCWLrSLsJPu5t+eWA/ycetzYAO5IOMcWAQ==",
+ "bin": {
+ "lz-string": "bin/bin.js"
+ }
+ },
"node_modules/make-dir": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/make-dir/-/make-dir-3.1.0.tgz",
@@ -10058,6 +10683,26 @@
"url": "https://github.com/sponsors/wooorm"
}
},
+ "node_modules/markdown-extensions": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmjs.org/markdown-extensions/-/markdown-extensions-1.1.1.tgz",
+ "integrity": "sha512-WWC0ZuMzCyDHYCasEGs4IPvLyTGftYwh6wIEOULOF0HXcqZlhwRzrK0w2VUlxWA98xnvb/jszw4ZSkJ6ADpM6Q==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/markdown-table": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/markdown-table/-/markdown-table-2.0.0.tgz",
+ "integrity": "sha512-Ezda85ToJUBhM6WGaG6veasyym+Tbs3cMAw/ZhOPqXiYsr0jgocBV3j3nx+4lk47plLlIqjwuTm/ywVI+zjJ/A==",
+ "dependencies": {
+ "repeat-string": "^1.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
"node_modules/md5.js": {
"version": "1.3.5",
"resolved": "https://registry.npmjs.org/md5.js/-/md5.js-1.3.5.tgz",
@@ -10093,6 +10738,20 @@
"url": "https://opencollective.com/unified"
}
},
+ "node_modules/mdast-util-find-and-replace": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmjs.org/mdast-util-find-and-replace/-/mdast-util-find-and-replace-1.1.1.tgz",
+ "integrity": "sha512-9cKl33Y21lyckGzpSmEQnIDjEfeeWelN5s1kUW1LwdB0Fkuq2u+4GdqcGEygYxJE8GVqCl0741bYXHgamfWAZA==",
+ "dependencies": {
+ "escape-string-regexp": "^4.0.0",
+ "unist-util-is": "^4.0.0",
+ "unist-util-visit-parents": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/mdast-util-from-markdown": {
"version": "1.3.1",
"resolved": "https://registry.npmjs.org/mdast-util-from-markdown/-/mdast-util-from-markdown-1.3.1.tgz",
@@ -10128,6 +10787,652 @@
"url": "https://opencollective.com/unified"
}
},
+ "node_modules/mdast-util-gfm": {
+ "version": "0.1.2",
+ "resolved": "https://registry.npmjs.org/mdast-util-gfm/-/mdast-util-gfm-0.1.2.tgz",
+ "integrity": "sha512-NNkhDx/qYcuOWB7xHUGWZYVXvjPFFd6afg6/e2g+SV4r9q5XUcCbV4Wfa3DLYIiD+xAEZc6K4MGaE/m0KDcPwQ==",
+ "dependencies": {
+ "mdast-util-gfm-autolink-literal": "^0.1.0",
+ "mdast-util-gfm-strikethrough": "^0.2.0",
+ "mdast-util-gfm-table": "^0.1.0",
+ "mdast-util-gfm-task-list-item": "^0.1.0",
+ "mdast-util-to-markdown": "^0.6.1"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-gfm-autolink-literal": {
+ "version": "0.1.3",
+ "resolved": "https://registry.npmjs.org/mdast-util-gfm-autolink-literal/-/mdast-util-gfm-autolink-literal-0.1.3.tgz",
+ "integrity": "sha512-GjmLjWrXg1wqMIO9+ZsRik/s7PLwTaeCHVB7vRxUwLntZc8mzmTsLVr6HW1yLokcnhfURsn5zmSVdi3/xWWu1A==",
+ "dependencies": {
+ "ccount": "^1.0.0",
+ "mdast-util-find-and-replace": "^1.1.0",
+ "micromark": "^2.11.3"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-gfm-autolink-literal/node_modules/micromark": {
+ "version": "2.11.4",
+ "resolved": "https://registry.npmjs.org/micromark/-/micromark-2.11.4.tgz",
+ "integrity": "sha512-+WoovN/ppKolQOFIAajxi7Lu9kInbPxFuTBVEavFcL8eAfVstoc5MocPmqBeAdBOJV00uaVjegzH4+MA0DN/uA==",
+ "funding": [
+ {
+ "type": "GitHub Sponsors",
+ "url": "https://github.com/sponsors/unifiedjs"
+ },
+ {
+ "type": "OpenCollective",
+ "url": "https://opencollective.com/unified"
+ }
+ ],
+ "dependencies": {
+ "debug": "^4.0.0",
+ "parse-entities": "^2.0.0"
+ }
+ },
+ "node_modules/mdast-util-gfm-strikethrough": {
+ "version": "0.2.3",
+ "resolved": "https://registry.npmjs.org/mdast-util-gfm-strikethrough/-/mdast-util-gfm-strikethrough-0.2.3.tgz",
+ "integrity": "sha512-5OQLXpt6qdbttcDG/UxYY7Yjj3e8P7X16LzvpX8pIQPYJ/C2Z1qFGMmcw+1PZMUM3Z8wt8NRfYTvCni93mgsgA==",
+ "dependencies": {
+ "mdast-util-to-markdown": "^0.6.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-gfm-table": {
+ "version": "0.1.6",
+ "resolved": "https://registry.npmjs.org/mdast-util-gfm-table/-/mdast-util-gfm-table-0.1.6.tgz",
+ "integrity": "sha512-j4yDxQ66AJSBwGkbpFEp9uG/LS1tZV3P33fN1gkyRB2LoRL+RR3f76m0HPHaby6F4Z5xr9Fv1URmATlRRUIpRQ==",
+ "dependencies": {
+ "markdown-table": "^2.0.0",
+ "mdast-util-to-markdown": "~0.6.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-gfm-task-list-item": {
+ "version": "0.1.6",
+ "resolved": "https://registry.npmjs.org/mdast-util-gfm-task-list-item/-/mdast-util-gfm-task-list-item-0.1.6.tgz",
+ "integrity": "sha512-/d51FFIfPsSmCIRNp7E6pozM9z1GYPIkSy1urQ8s/o4TC22BZ7DqfHFWiqBD23bc7J3vV1Fc9O4QIHBlfuit8A==",
+ "dependencies": {
+ "mdast-util-to-markdown": "~0.6.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/mdast-util-mdx/-/mdast-util-mdx-2.0.1.tgz",
+ "integrity": "sha512-38w5y+r8nyKlGvNjSEqWrhG0w5PmnRA+wnBvm+ulYCct7nsGYhFVb0lljS9bQav4psDAS1eGkP2LMVcZBi/aqw==",
+ "dependencies": {
+ "mdast-util-from-markdown": "^1.0.0",
+ "mdast-util-mdx-expression": "^1.0.0",
+ "mdast-util-mdx-jsx": "^2.0.0",
+ "mdast-util-mdxjs-esm": "^1.0.0",
+ "mdast-util-to-markdown": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-expression": {
+ "version": "1.3.2",
+ "resolved": "https://registry.npmjs.org/mdast-util-mdx-expression/-/mdast-util-mdx-expression-1.3.2.tgz",
+ "integrity": "sha512-xIPmR5ReJDu/DHH1OoIT1HkuybIfRGYRywC+gJtI7qHjCJp/M9jrmBEJW22O8lskDWm562BX2W8TiAwRTb0rKA==",
+ "dependencies": {
+ "@types/estree-jsx": "^1.0.0",
+ "@types/hast": "^2.0.0",
+ "@types/mdast": "^3.0.0",
+ "mdast-util-from-markdown": "^1.0.0",
+ "mdast-util-to-markdown": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-expression/node_modules/longest-streak": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/longest-streak/-/longest-streak-3.1.0.tgz",
+ "integrity": "sha512-9Ri+o0JYgehTaVBBDoMqIl8GXtbWg711O3srftcHhZ0dqnETqLaoIK0x17fUw9rFSlK/0NlsKe0Ahhyl5pXE2g==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx-expression/node_modules/mdast-util-to-markdown": {
+ "version": "1.5.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-markdown/-/mdast-util-to-markdown-1.5.0.tgz",
+ "integrity": "sha512-bbv7TPv/WC49thZPg3jXuqzuvI45IL2EVAr/KxF0BSdHsU0ceFHOmwQn6evxAh1GaoK/6GQ1wp4R4oW2+LFL/A==",
+ "dependencies": {
+ "@types/mdast": "^3.0.0",
+ "@types/unist": "^2.0.0",
+ "longest-streak": "^3.0.0",
+ "mdast-util-phrasing": "^3.0.0",
+ "mdast-util-to-string": "^3.0.0",
+ "micromark-util-decode-string": "^1.0.0",
+ "unist-util-visit": "^4.0.0",
+ "zwitch": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-expression/node_modules/mdast-util-to-string": {
+ "version": "3.2.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-string/-/mdast-util-to-string-3.2.0.tgz",
+ "integrity": "sha512-V4Zn/ncyN1QNSqSBxTrMOLpjr+IKdHl2v3KVLoWmDPscP4r9GcCi71gjgvUV1SFSKh92AjAG4peFuBl2/YgCJg==",
+ "dependencies": {
+ "@types/mdast": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-expression/node_modules/unist-util-is": {
+ "version": "5.2.1",
+ "resolved": "https://registry.npmjs.org/unist-util-is/-/unist-util-is-5.2.1.tgz",
+ "integrity": "sha512-u9njyyfEh43npf1M+yGKDGVPbY/JWEemg5nH05ncKPfi+kBbKBJoTdsogMu33uhytuLlv9y0O7GH7fEdwLdLQw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-expression/node_modules/unist-util-visit": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmjs.org/unist-util-visit/-/unist-util-visit-4.1.2.tgz",
+ "integrity": "sha512-MSd8OUGISqHdVvfY9TPhyK2VdUrPgxkUtWSuMHF6XAAFuL4LokseigBnZtPnJMu+FbynTkFNnFlyjxpVKujMRg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0",
+ "unist-util-visit-parents": "^5.1.1"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-expression/node_modules/unist-util-visit-parents": {
+ "version": "5.1.3",
+ "resolved": "https://registry.npmjs.org/unist-util-visit-parents/-/unist-util-visit-parents-5.1.3.tgz",
+ "integrity": "sha512-x6+y8g7wWMyQhL1iZfhIPhDAs7Xwbn9nRosDXl7qoPTSCy0yNxnKc+hWokFifWQIDGi154rdUqKvbCa4+1kLhg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-expression/node_modules/zwitch": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/zwitch/-/zwitch-2.0.4.tgz",
+ "integrity": "sha512-bXE4cR/kVZhKZX/RjPEflHaKVhUVl85noU3v6b8apfQEc1x4A+zBxjZ4lN8LqGd6WZ3dl98pY4o717VFmoPp+A==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx": {
+ "version": "2.1.4",
+ "resolved": "https://registry.npmjs.org/mdast-util-mdx-jsx/-/mdast-util-mdx-jsx-2.1.4.tgz",
+ "integrity": "sha512-DtMn9CmVhVzZx3f+optVDF8yFgQVt7FghCRNdlIaS3X5Bnym3hZwPbg/XW86vdpKjlc1PVj26SpnLGeJBXD3JA==",
+ "dependencies": {
+ "@types/estree-jsx": "^1.0.0",
+ "@types/hast": "^2.0.0",
+ "@types/mdast": "^3.0.0",
+ "@types/unist": "^2.0.0",
+ "ccount": "^2.0.0",
+ "mdast-util-from-markdown": "^1.1.0",
+ "mdast-util-to-markdown": "^1.3.0",
+ "parse-entities": "^4.0.0",
+ "stringify-entities": "^4.0.0",
+ "unist-util-remove-position": "^4.0.0",
+ "unist-util-stringify-position": "^3.0.0",
+ "vfile-message": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/ccount": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/ccount/-/ccount-2.0.1.tgz",
+ "integrity": "sha512-eyrF0jiFpY+3drT6383f1qhkbGsLSifNAjA61IUjZjmLCWjItY6LB9ft9YhoDgwfmclB2zhu51Lc7+95b8NRAg==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/character-entities-legacy": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/character-entities-legacy/-/character-entities-legacy-3.0.0.tgz",
+ "integrity": "sha512-RpPp0asT/6ufRm//AJVwpViZbGM/MkjQFxJccQRHmISF/22NBtsHqAWmL+/pmkPWoIUJdWyeVleTl1wydHATVQ==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/character-reference-invalid": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/character-reference-invalid/-/character-reference-invalid-2.0.1.tgz",
+ "integrity": "sha512-iBZ4F4wRbyORVsu0jPV7gXkOsGYjGHPmAyv+HiHG8gi5PtC9KI2j1+v8/tlibRvjoWX027ypmG/n0HtO5t7unw==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/is-alphabetical": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/is-alphabetical/-/is-alphabetical-2.0.1.tgz",
+ "integrity": "sha512-FWyyY60MeTNyeSRpkM2Iry0G9hpr7/9kD40mD/cGQEuilcZYS4okz8SN2Q6rLCJ8gbCt6fN+rC+6tMGS99LaxQ==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/is-alphanumerical": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/is-alphanumerical/-/is-alphanumerical-2.0.1.tgz",
+ "integrity": "sha512-hmbYhX/9MUMF5uh7tOXyK/n0ZvWpad5caBA17GsC6vyuCqaWliRG5K1qS9inmUhEMaOBIW7/whAnSwveW/LtZw==",
+ "dependencies": {
+ "is-alphabetical": "^2.0.0",
+ "is-decimal": "^2.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/is-decimal": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/is-decimal/-/is-decimal-2.0.1.tgz",
+ "integrity": "sha512-AAB9hiomQs5DXWcRB1rqsxGUstbRroFOPPVAomNk/3XHR5JyEZChOyTWe2oayKnsSsr/kcGqF+z6yuH6HHpN0A==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/is-hexadecimal": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/is-hexadecimal/-/is-hexadecimal-2.0.1.tgz",
+ "integrity": "sha512-DgZQp241c8oO6cA1SbTEWiXeoxV42vlcJxgH+B3hi1AiqqKruZR3ZGF8In3fj4+/y/7rHvlOZLZtgJ/4ttYGZg==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/longest-streak": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/longest-streak/-/longest-streak-3.1.0.tgz",
+ "integrity": "sha512-9Ri+o0JYgehTaVBBDoMqIl8GXtbWg711O3srftcHhZ0dqnETqLaoIK0x17fUw9rFSlK/0NlsKe0Ahhyl5pXE2g==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/mdast-util-to-markdown": {
+ "version": "1.5.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-markdown/-/mdast-util-to-markdown-1.5.0.tgz",
+ "integrity": "sha512-bbv7TPv/WC49thZPg3jXuqzuvI45IL2EVAr/KxF0BSdHsU0ceFHOmwQn6evxAh1GaoK/6GQ1wp4R4oW2+LFL/A==",
+ "dependencies": {
+ "@types/mdast": "^3.0.0",
+ "@types/unist": "^2.0.0",
+ "longest-streak": "^3.0.0",
+ "mdast-util-phrasing": "^3.0.0",
+ "mdast-util-to-string": "^3.0.0",
+ "micromark-util-decode-string": "^1.0.0",
+ "unist-util-visit": "^4.0.0",
+ "zwitch": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/mdast-util-to-string": {
+ "version": "3.2.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-string/-/mdast-util-to-string-3.2.0.tgz",
+ "integrity": "sha512-V4Zn/ncyN1QNSqSBxTrMOLpjr+IKdHl2v3KVLoWmDPscP4r9GcCi71gjgvUV1SFSKh92AjAG4peFuBl2/YgCJg==",
+ "dependencies": {
+ "@types/mdast": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/parse-entities": {
+ "version": "4.0.1",
+ "resolved": "https://registry.npmjs.org/parse-entities/-/parse-entities-4.0.1.tgz",
+ "integrity": "sha512-SWzvYcSJh4d/SGLIOQfZ/CoNv6BTlI6YEQ7Nj82oDVnRpwe/Z/F1EMx42x3JAOwGBlCjeCH0BRJQbQ/opHL17w==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "character-entities": "^2.0.0",
+ "character-entities-legacy": "^3.0.0",
+ "character-reference-invalid": "^2.0.0",
+ "decode-named-character-reference": "^1.0.0",
+ "is-alphanumerical": "^2.0.0",
+ "is-decimal": "^2.0.0",
+ "is-hexadecimal": "^2.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/unist-util-is": {
+ "version": "5.2.1",
+ "resolved": "https://registry.npmjs.org/unist-util-is/-/unist-util-is-5.2.1.tgz",
+ "integrity": "sha512-u9njyyfEh43npf1M+yGKDGVPbY/JWEemg5nH05ncKPfi+kBbKBJoTdsogMu33uhytuLlv9y0O7GH7fEdwLdLQw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/unist-util-remove-position": {
+ "version": "4.0.2",
+ "resolved": "https://registry.npmjs.org/unist-util-remove-position/-/unist-util-remove-position-4.0.2.tgz",
+ "integrity": "sha512-TkBb0HABNmxzAcfLf4qsIbFbaPDvMO6wa3b3j4VcEzFVaw1LBKwnW4/sRJ/atSLSzoIg41JWEdnE7N6DIhGDGQ==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-visit": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/unist-util-visit": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmjs.org/unist-util-visit/-/unist-util-visit-4.1.2.tgz",
+ "integrity": "sha512-MSd8OUGISqHdVvfY9TPhyK2VdUrPgxkUtWSuMHF6XAAFuL4LokseigBnZtPnJMu+FbynTkFNnFlyjxpVKujMRg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0",
+ "unist-util-visit-parents": "^5.1.1"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/unist-util-visit-parents": {
+ "version": "5.1.3",
+ "resolved": "https://registry.npmjs.org/unist-util-visit-parents/-/unist-util-visit-parents-5.1.3.tgz",
+ "integrity": "sha512-x6+y8g7wWMyQhL1iZfhIPhDAs7Xwbn9nRosDXl7qoPTSCy0yNxnKc+hWokFifWQIDGi154rdUqKvbCa4+1kLhg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/vfile-message": {
+ "version": "3.1.4",
+ "resolved": "https://registry.npmjs.org/vfile-message/-/vfile-message-3.1.4.tgz",
+ "integrity": "sha512-fa0Z6P8HUrQN4BZaX05SIVXic+7kE3b05PWAtPuYP9QLHsLKYR7/AlLW3NtOrpXRLeawpDLMsVkmk5DG0NXgWw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-stringify-position": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx-jsx/node_modules/zwitch": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/zwitch/-/zwitch-2.0.4.tgz",
+ "integrity": "sha512-bXE4cR/kVZhKZX/RjPEflHaKVhUVl85noU3v6b8apfQEc1x4A+zBxjZ4lN8LqGd6WZ3dl98pY4o717VFmoPp+A==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx/node_modules/longest-streak": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/longest-streak/-/longest-streak-3.1.0.tgz",
+ "integrity": "sha512-9Ri+o0JYgehTaVBBDoMqIl8GXtbWg711O3srftcHhZ0dqnETqLaoIK0x17fUw9rFSlK/0NlsKe0Ahhyl5pXE2g==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdx/node_modules/mdast-util-to-markdown": {
+ "version": "1.5.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-markdown/-/mdast-util-to-markdown-1.5.0.tgz",
+ "integrity": "sha512-bbv7TPv/WC49thZPg3jXuqzuvI45IL2EVAr/KxF0BSdHsU0ceFHOmwQn6evxAh1GaoK/6GQ1wp4R4oW2+LFL/A==",
+ "dependencies": {
+ "@types/mdast": "^3.0.0",
+ "@types/unist": "^2.0.0",
+ "longest-streak": "^3.0.0",
+ "mdast-util-phrasing": "^3.0.0",
+ "mdast-util-to-string": "^3.0.0",
+ "micromark-util-decode-string": "^1.0.0",
+ "unist-util-visit": "^4.0.0",
+ "zwitch": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx/node_modules/mdast-util-to-string": {
+ "version": "3.2.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-string/-/mdast-util-to-string-3.2.0.tgz",
+ "integrity": "sha512-V4Zn/ncyN1QNSqSBxTrMOLpjr+IKdHl2v3KVLoWmDPscP4r9GcCi71gjgvUV1SFSKh92AjAG4peFuBl2/YgCJg==",
+ "dependencies": {
+ "@types/mdast": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx/node_modules/unist-util-is": {
+ "version": "5.2.1",
+ "resolved": "https://registry.npmjs.org/unist-util-is/-/unist-util-is-5.2.1.tgz",
+ "integrity": "sha512-u9njyyfEh43npf1M+yGKDGVPbY/JWEemg5nH05ncKPfi+kBbKBJoTdsogMu33uhytuLlv9y0O7GH7fEdwLdLQw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx/node_modules/unist-util-visit": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmjs.org/unist-util-visit/-/unist-util-visit-4.1.2.tgz",
+ "integrity": "sha512-MSd8OUGISqHdVvfY9TPhyK2VdUrPgxkUtWSuMHF6XAAFuL4LokseigBnZtPnJMu+FbynTkFNnFlyjxpVKujMRg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0",
+ "unist-util-visit-parents": "^5.1.1"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx/node_modules/unist-util-visit-parents": {
+ "version": "5.1.3",
+ "resolved": "https://registry.npmjs.org/unist-util-visit-parents/-/unist-util-visit-parents-5.1.3.tgz",
+ "integrity": "sha512-x6+y8g7wWMyQhL1iZfhIPhDAs7Xwbn9nRosDXl7qoPTSCy0yNxnKc+hWokFifWQIDGi154rdUqKvbCa4+1kLhg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdx/node_modules/zwitch": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/zwitch/-/zwitch-2.0.4.tgz",
+ "integrity": "sha512-bXE4cR/kVZhKZX/RjPEflHaKVhUVl85noU3v6b8apfQEc1x4A+zBxjZ4lN8LqGd6WZ3dl98pY4o717VFmoPp+A==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdxjs-esm": {
+ "version": "1.3.1",
+ "resolved": "https://registry.npmjs.org/mdast-util-mdxjs-esm/-/mdast-util-mdxjs-esm-1.3.1.tgz",
+ "integrity": "sha512-SXqglS0HrEvSdUEfoXFtcg7DRl7S2cwOXc7jkuusG472Mmjag34DUDeOJUZtl+BVnyeO1frIgVpHlNRWc2gk/w==",
+ "dependencies": {
+ "@types/estree-jsx": "^1.0.0",
+ "@types/hast": "^2.0.0",
+ "@types/mdast": "^3.0.0",
+ "mdast-util-from-markdown": "^1.0.0",
+ "mdast-util-to-markdown": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdxjs-esm/node_modules/longest-streak": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/longest-streak/-/longest-streak-3.1.0.tgz",
+ "integrity": "sha512-9Ri+o0JYgehTaVBBDoMqIl8GXtbWg711O3srftcHhZ0dqnETqLaoIK0x17fUw9rFSlK/0NlsKe0Ahhyl5pXE2g==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-mdxjs-esm/node_modules/mdast-util-to-markdown": {
+ "version": "1.5.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-markdown/-/mdast-util-to-markdown-1.5.0.tgz",
+ "integrity": "sha512-bbv7TPv/WC49thZPg3jXuqzuvI45IL2EVAr/KxF0BSdHsU0ceFHOmwQn6evxAh1GaoK/6GQ1wp4R4oW2+LFL/A==",
+ "dependencies": {
+ "@types/mdast": "^3.0.0",
+ "@types/unist": "^2.0.0",
+ "longest-streak": "^3.0.0",
+ "mdast-util-phrasing": "^3.0.0",
+ "mdast-util-to-string": "^3.0.0",
+ "micromark-util-decode-string": "^1.0.0",
+ "unist-util-visit": "^4.0.0",
+ "zwitch": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdxjs-esm/node_modules/mdast-util-to-string": {
+ "version": "3.2.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-string/-/mdast-util-to-string-3.2.0.tgz",
+ "integrity": "sha512-V4Zn/ncyN1QNSqSBxTrMOLpjr+IKdHl2v3KVLoWmDPscP4r9GcCi71gjgvUV1SFSKh92AjAG4peFuBl2/YgCJg==",
+ "dependencies": {
+ "@types/mdast": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdxjs-esm/node_modules/unist-util-is": {
+ "version": "5.2.1",
+ "resolved": "https://registry.npmjs.org/unist-util-is/-/unist-util-is-5.2.1.tgz",
+ "integrity": "sha512-u9njyyfEh43npf1M+yGKDGVPbY/JWEemg5nH05ncKPfi+kBbKBJoTdsogMu33uhytuLlv9y0O7GH7fEdwLdLQw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdxjs-esm/node_modules/unist-util-visit": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmjs.org/unist-util-visit/-/unist-util-visit-4.1.2.tgz",
+ "integrity": "sha512-MSd8OUGISqHdVvfY9TPhyK2VdUrPgxkUtWSuMHF6XAAFuL4LokseigBnZtPnJMu+FbynTkFNnFlyjxpVKujMRg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0",
+ "unist-util-visit-parents": "^5.1.1"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdxjs-esm/node_modules/unist-util-visit-parents": {
+ "version": "5.1.3",
+ "resolved": "https://registry.npmjs.org/unist-util-visit-parents/-/unist-util-visit-parents-5.1.3.tgz",
+ "integrity": "sha512-x6+y8g7wWMyQhL1iZfhIPhDAs7Xwbn9nRosDXl7qoPTSCy0yNxnKc+hWokFifWQIDGi154rdUqKvbCa4+1kLhg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-mdxjs-esm/node_modules/zwitch": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/zwitch/-/zwitch-2.0.4.tgz",
+ "integrity": "sha512-bXE4cR/kVZhKZX/RjPEflHaKVhUVl85noU3v6b8apfQEc1x4A+zBxjZ4lN8LqGd6WZ3dl98pY4o717VFmoPp+A==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/mdast-util-phrasing": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/mdast-util-phrasing/-/mdast-util-phrasing-3.0.1.tgz",
+ "integrity": "sha512-WmI1gTXUBJo4/ZmSk79Wcb2HcjPJBzM1nlI/OUWA8yk2X9ik3ffNbBGsU+09BFmXaL1IBb9fiuvq6/KMiNycSg==",
+ "dependencies": {
+ "@types/mdast": "^3.0.0",
+ "unist-util-is": "^5.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-phrasing/node_modules/unist-util-is": {
+ "version": "5.2.1",
+ "resolved": "https://registry.npmjs.org/unist-util-is/-/unist-util-is-5.2.1.tgz",
+ "integrity": "sha512-u9njyyfEh43npf1M+yGKDGVPbY/JWEemg5nH05ncKPfi+kBbKBJoTdsogMu33uhytuLlv9y0O7GH7fEdwLdLQw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/mdast-util-to-hast": {
"version": "10.0.1",
"resolved": "https://registry.npmjs.org/mdast-util-to-hast/-/mdast-util-to-hast-10.0.1.tgz",
@@ -10147,6 +11452,23 @@
"url": "https://opencollective.com/unified"
}
},
+ "node_modules/mdast-util-to-markdown": {
+ "version": "0.6.5",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-markdown/-/mdast-util-to-markdown-0.6.5.tgz",
+ "integrity": "sha512-XeV9sDE7ZlOQvs45C9UKMtfTcctcaj/pGwH8YLbMHoMOXNNCn2LsqVQOqrF1+/NU8lKDAqozme9SCXWyo9oAcQ==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "longest-streak": "^2.0.0",
+ "mdast-util-to-string": "^2.0.0",
+ "parse-entities": "^2.0.0",
+ "repeat-string": "^1.0.0",
+ "zwitch": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/mdast-util-to-string": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/mdast-util-to-string/-/mdast-util-to-string-2.0.0.tgz",
@@ -10326,6 +11648,298 @@
"uvu": "^0.5.0"
}
},
+ "node_modules/micromark-extension-gfm": {
+ "version": "0.3.3",
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm/-/micromark-extension-gfm-0.3.3.tgz",
+ "integrity": "sha512-oVN4zv5/tAIA+l3GbMi7lWeYpJ14oQyJ3uEim20ktYFAcfX1x3LNlFGGlmrZHt7u9YlKExmyJdDGaTt6cMSR/A==",
+ "dependencies": {
+ "micromark": "~2.11.0",
+ "micromark-extension-gfm-autolink-literal": "~0.5.0",
+ "micromark-extension-gfm-strikethrough": "~0.6.5",
+ "micromark-extension-gfm-table": "~0.4.0",
+ "micromark-extension-gfm-tagfilter": "~0.3.0",
+ "micromark-extension-gfm-task-list-item": "~0.3.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/micromark-extension-gfm-autolink-literal": {
+ "version": "0.5.7",
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm-autolink-literal/-/micromark-extension-gfm-autolink-literal-0.5.7.tgz",
+ "integrity": "sha512-ePiDGH0/lhcngCe8FtH4ARFoxKTUelMp4L7Gg2pujYD5CSMb9PbblnyL+AAMud/SNMyusbS2XDSiPIRcQoNFAw==",
+ "dependencies": {
+ "micromark": "~2.11.3"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/micromark-extension-gfm-autolink-literal/node_modules/micromark": {
+ "version": "2.11.4",
+ "resolved": "https://registry.npmjs.org/micromark/-/micromark-2.11.4.tgz",
+ "integrity": "sha512-+WoovN/ppKolQOFIAajxi7Lu9kInbPxFuTBVEavFcL8eAfVstoc5MocPmqBeAdBOJV00uaVjegzH4+MA0DN/uA==",
+ "funding": [
+ {
+ "type": "GitHub Sponsors",
+ "url": "https://github.com/sponsors/unifiedjs"
+ },
+ {
+ "type": "OpenCollective",
+ "url": "https://opencollective.com/unified"
+ }
+ ],
+ "dependencies": {
+ "debug": "^4.0.0",
+ "parse-entities": "^2.0.0"
+ }
+ },
+ "node_modules/micromark-extension-gfm-strikethrough": {
+ "version": "0.6.5",
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm-strikethrough/-/micromark-extension-gfm-strikethrough-0.6.5.tgz",
+ "integrity": "sha512-PpOKlgokpQRwUesRwWEp+fHjGGkZEejj83k9gU5iXCbDG+XBA92BqnRKYJdfqfkrRcZRgGuPuXb7DaK/DmxOhw==",
+ "dependencies": {
+ "micromark": "~2.11.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/micromark-extension-gfm-strikethrough/node_modules/micromark": {
+ "version": "2.11.4",
+ "resolved": "https://registry.npmjs.org/micromark/-/micromark-2.11.4.tgz",
+ "integrity": "sha512-+WoovN/ppKolQOFIAajxi7Lu9kInbPxFuTBVEavFcL8eAfVstoc5MocPmqBeAdBOJV00uaVjegzH4+MA0DN/uA==",
+ "funding": [
+ {
+ "type": "GitHub Sponsors",
+ "url": "https://github.com/sponsors/unifiedjs"
+ },
+ {
+ "type": "OpenCollective",
+ "url": "https://opencollective.com/unified"
+ }
+ ],
+ "dependencies": {
+ "debug": "^4.0.0",
+ "parse-entities": "^2.0.0"
+ }
+ },
+ "node_modules/micromark-extension-gfm-table": {
+ "version": "0.4.3",
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm-table/-/micromark-extension-gfm-table-0.4.3.tgz",
+ "integrity": "sha512-hVGvESPq0fk6ALWtomcwmgLvH8ZSVpcPjzi0AjPclB9FsVRgMtGZkUcpE0zgjOCFAznKepF4z3hX8z6e3HODdA==",
+ "dependencies": {
+ "micromark": "~2.11.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/micromark-extension-gfm-table/node_modules/micromark": {
+ "version": "2.11.4",
+ "resolved": "https://registry.npmjs.org/micromark/-/micromark-2.11.4.tgz",
+ "integrity": "sha512-+WoovN/ppKolQOFIAajxi7Lu9kInbPxFuTBVEavFcL8eAfVstoc5MocPmqBeAdBOJV00uaVjegzH4+MA0DN/uA==",
+ "funding": [
+ {
+ "type": "GitHub Sponsors",
+ "url": "https://github.com/sponsors/unifiedjs"
+ },
+ {
+ "type": "OpenCollective",
+ "url": "https://opencollective.com/unified"
+ }
+ ],
+ "dependencies": {
+ "debug": "^4.0.0",
+ "parse-entities": "^2.0.0"
+ }
+ },
+ "node_modules/micromark-extension-gfm-tagfilter": {
+ "version": "0.3.0",
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm-tagfilter/-/micromark-extension-gfm-tagfilter-0.3.0.tgz",
+ "integrity": "sha512-9GU0xBatryXifL//FJH+tAZ6i240xQuFrSL7mYi8f4oZSbc+NvXjkrHemeYP0+L4ZUT+Ptz3b95zhUZnMtoi/Q==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/micromark-extension-gfm-task-list-item": {
+ "version": "0.3.3",
+ "resolved": "https://registry.npmjs.org/micromark-extension-gfm-task-list-item/-/micromark-extension-gfm-task-list-item-0.3.3.tgz",
+ "integrity": "sha512-0zvM5iSLKrc/NQl84pZSjGo66aTGd57C1idmlWmE87lkMcXrTxg1uXa/nXomxJytoje9trP0NDLvw4bZ/Z/XCQ==",
+ "dependencies": {
+ "micromark": "~2.11.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/micromark-extension-gfm-task-list-item/node_modules/micromark": {
+ "version": "2.11.4",
+ "resolved": "https://registry.npmjs.org/micromark/-/micromark-2.11.4.tgz",
+ "integrity": "sha512-+WoovN/ppKolQOFIAajxi7Lu9kInbPxFuTBVEavFcL8eAfVstoc5MocPmqBeAdBOJV00uaVjegzH4+MA0DN/uA==",
+ "funding": [
+ {
+ "type": "GitHub Sponsors",
+ "url": "https://github.com/sponsors/unifiedjs"
+ },
+ {
+ "type": "OpenCollective",
+ "url": "https://opencollective.com/unified"
+ }
+ ],
+ "dependencies": {
+ "debug": "^4.0.0",
+ "parse-entities": "^2.0.0"
+ }
+ },
+ "node_modules/micromark-extension-gfm/node_modules/micromark": {
+ "version": "2.11.4",
+ "resolved": "https://registry.npmjs.org/micromark/-/micromark-2.11.4.tgz",
+ "integrity": "sha512-+WoovN/ppKolQOFIAajxi7Lu9kInbPxFuTBVEavFcL8eAfVstoc5MocPmqBeAdBOJV00uaVjegzH4+MA0DN/uA==",
+ "funding": [
+ {
+ "type": "GitHub Sponsors",
+ "url": "https://github.com/sponsors/unifiedjs"
+ },
+ {
+ "type": "OpenCollective",
+ "url": "https://opencollective.com/unified"
+ }
+ ],
+ "dependencies": {
+ "debug": "^4.0.0",
+ "parse-entities": "^2.0.0"
+ }
+ },
+ "node_modules/micromark-extension-mdx-expression": {
+ "version": "1.0.8",
+ "resolved": "https://registry.npmjs.org/micromark-extension-mdx-expression/-/micromark-extension-mdx-expression-1.0.8.tgz",
+ "integrity": "sha512-zZpeQtc5wfWKdzDsHRBY003H2Smg+PUi2REhqgIhdzAa5xonhP03FcXxqFSerFiNUr5AWmHpaNPQTBVOS4lrXw==",
+ "funding": [
+ {
+ "type": "GitHub Sponsors",
+ "url": "https://github.com/sponsors/unifiedjs"
+ },
+ {
+ "type": "OpenCollective",
+ "url": "https://opencollective.com/unified"
+ }
+ ],
+ "dependencies": {
+ "@types/estree": "^1.0.0",
+ "micromark-factory-mdx-expression": "^1.0.0",
+ "micromark-factory-space": "^1.0.0",
+ "micromark-util-character": "^1.0.0",
+ "micromark-util-events-to-acorn": "^1.0.0",
+ "micromark-util-symbol": "^1.0.0",
+ "micromark-util-types": "^1.0.0",
+ "uvu": "^0.5.0"
+ }
+ },
+ "node_modules/micromark-extension-mdx-jsx": {
+ "version": "1.0.5",
+ "resolved": "https://registry.npmjs.org/micromark-extension-mdx-jsx/-/micromark-extension-mdx-jsx-1.0.5.tgz",
+ "integrity": "sha512-gPH+9ZdmDflbu19Xkb8+gheqEDqkSpdCEubQyxuz/Hn8DOXiXvrXeikOoBA71+e8Pfi0/UYmU3wW3H58kr7akA==",
+ "dependencies": {
+ "@types/acorn": "^4.0.0",
+ "@types/estree": "^1.0.0",
+ "estree-util-is-identifier-name": "^2.0.0",
+ "micromark-factory-mdx-expression": "^1.0.0",
+ "micromark-factory-space": "^1.0.0",
+ "micromark-util-character": "^1.0.0",
+ "micromark-util-symbol": "^1.0.0",
+ "micromark-util-types": "^1.0.0",
+ "uvu": "^0.5.0",
+ "vfile-message": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/micromark-extension-mdx-jsx/node_modules/vfile-message": {
+ "version": "3.1.4",
+ "resolved": "https://registry.npmjs.org/vfile-message/-/vfile-message-3.1.4.tgz",
+ "integrity": "sha512-fa0Z6P8HUrQN4BZaX05SIVXic+7kE3b05PWAtPuYP9QLHsLKYR7/AlLW3NtOrpXRLeawpDLMsVkmk5DG0NXgWw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-stringify-position": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/micromark-extension-mdx-md": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/micromark-extension-mdx-md/-/micromark-extension-mdx-md-1.0.1.tgz",
+ "integrity": "sha512-7MSuj2S7xjOQXAjjkbjBsHkMtb+mDGVW6uI2dBL9snOBCbZmoNgDAeZ0nSn9j3T42UE/g2xVNMn18PJxZvkBEA==",
+ "dependencies": {
+ "micromark-util-types": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/micromark-extension-mdxjs": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/micromark-extension-mdxjs/-/micromark-extension-mdxjs-1.0.1.tgz",
+ "integrity": "sha512-7YA7hF6i5eKOfFUzZ+0z6avRG52GpWR8DL+kN47y3f2KhxbBZMhmxe7auOeaTBrW2DenbbZTf1ea9tA2hDpC2Q==",
+ "dependencies": {
+ "acorn": "^8.0.0",
+ "acorn-jsx": "^5.0.0",
+ "micromark-extension-mdx-expression": "^1.0.0",
+ "micromark-extension-mdx-jsx": "^1.0.0",
+ "micromark-extension-mdx-md": "^1.0.0",
+ "micromark-extension-mdxjs-esm": "^1.0.0",
+ "micromark-util-combine-extensions": "^1.0.0",
+ "micromark-util-types": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/micromark-extension-mdxjs-esm": {
+ "version": "1.0.5",
+ "resolved": "https://registry.npmjs.org/micromark-extension-mdxjs-esm/-/micromark-extension-mdxjs-esm-1.0.5.tgz",
+ "integrity": "sha512-xNRBw4aoURcyz/S69B19WnZAkWJMxHMT5hE36GtDAyhoyn/8TuAeqjFJQlwk+MKQsUD7b3l7kFX+vlfVWgcX1w==",
+ "dependencies": {
+ "@types/estree": "^1.0.0",
+ "micromark-core-commonmark": "^1.0.0",
+ "micromark-util-character": "^1.0.0",
+ "micromark-util-events-to-acorn": "^1.0.0",
+ "micromark-util-symbol": "^1.0.0",
+ "micromark-util-types": "^1.0.0",
+ "unist-util-position-from-estree": "^1.1.0",
+ "uvu": "^0.5.0",
+ "vfile-message": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/micromark-extension-mdxjs-esm/node_modules/vfile-message": {
+ "version": "3.1.4",
+ "resolved": "https://registry.npmjs.org/vfile-message/-/vfile-message-3.1.4.tgz",
+ "integrity": "sha512-fa0Z6P8HUrQN4BZaX05SIVXic+7kE3b05PWAtPuYP9QLHsLKYR7/AlLW3NtOrpXRLeawpDLMsVkmk5DG0NXgWw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-stringify-position": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/micromark-factory-destination": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/micromark-factory-destination/-/micromark-factory-destination-1.1.0.tgz",
@@ -10367,6 +11981,44 @@
"uvu": "^0.5.0"
}
},
+ "node_modules/micromark-factory-mdx-expression": {
+ "version": "1.0.9",
+ "resolved": "https://registry.npmjs.org/micromark-factory-mdx-expression/-/micromark-factory-mdx-expression-1.0.9.tgz",
+ "integrity": "sha512-jGIWzSmNfdnkJq05c7b0+Wv0Kfz3NJ3N4cBjnbO4zjXIlxJr+f8lk+5ZmwFvqdAbUy2q6B5rCY//g0QAAaXDWA==",
+ "funding": [
+ {
+ "type": "GitHub Sponsors",
+ "url": "https://github.com/sponsors/unifiedjs"
+ },
+ {
+ "type": "OpenCollective",
+ "url": "https://opencollective.com/unified"
+ }
+ ],
+ "dependencies": {
+ "@types/estree": "^1.0.0",
+ "micromark-util-character": "^1.0.0",
+ "micromark-util-events-to-acorn": "^1.0.0",
+ "micromark-util-symbol": "^1.0.0",
+ "micromark-util-types": "^1.0.0",
+ "unist-util-position-from-estree": "^1.0.0",
+ "uvu": "^0.5.0",
+ "vfile-message": "^3.0.0"
+ }
+ },
+ "node_modules/micromark-factory-mdx-expression/node_modules/vfile-message": {
+ "version": "3.1.4",
+ "resolved": "https://registry.npmjs.org/vfile-message/-/vfile-message-3.1.4.tgz",
+ "integrity": "sha512-fa0Z6P8HUrQN4BZaX05SIVXic+7kE3b05PWAtPuYP9QLHsLKYR7/AlLW3NtOrpXRLeawpDLMsVkmk5DG0NXgWw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-stringify-position": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/micromark-factory-space": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/micromark-factory-space/-/micromark-factory-space-1.1.0.tgz",
@@ -10558,6 +12210,44 @@
}
]
},
+ "node_modules/micromark-util-events-to-acorn": {
+ "version": "1.2.3",
+ "resolved": "https://registry.npmjs.org/micromark-util-events-to-acorn/-/micromark-util-events-to-acorn-1.2.3.tgz",
+ "integrity": "sha512-ij4X7Wuc4fED6UoLWkmo0xJQhsktfNh1J0m8g4PbIMPlx+ek/4YdW5mvbye8z/aZvAPUoxgXHrwVlXAPKMRp1w==",
+ "funding": [
+ {
+ "type": "GitHub Sponsors",
+ "url": "https://github.com/sponsors/unifiedjs"
+ },
+ {
+ "type": "OpenCollective",
+ "url": "https://opencollective.com/unified"
+ }
+ ],
+ "dependencies": {
+ "@types/acorn": "^4.0.0",
+ "@types/estree": "^1.0.0",
+ "@types/unist": "^2.0.0",
+ "estree-util-visit": "^1.0.0",
+ "micromark-util-symbol": "^1.0.0",
+ "micromark-util-types": "^1.0.0",
+ "uvu": "^0.5.0",
+ "vfile-message": "^3.0.0"
+ }
+ },
+ "node_modules/micromark-util-events-to-acorn/node_modules/vfile-message": {
+ "version": "3.1.4",
+ "resolved": "https://registry.npmjs.org/vfile-message/-/vfile-message-3.1.4.tgz",
+ "integrity": "sha512-fa0Z6P8HUrQN4BZaX05SIVXic+7kE3b05PWAtPuYP9QLHsLKYR7/AlLW3NtOrpXRLeawpDLMsVkmk5DG0NXgWw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-stringify-position": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/micromark-util-html-tag-name": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/micromark-util-html-tag-name/-/micromark-util-html-tag-name-1.2.0.tgz",
@@ -12117,6 +13807,16 @@
"resolved": "https://registry.npmjs.org/performance-now/-/performance-now-0.2.0.tgz",
"integrity": "sha512-YHk5ez1hmMR5LOkb9iJkLKqoBlL7WD5M8ljC75ZfzXriuBIVNuecaXuU7e+hOwyqf24Wxhh7Vxgt7Hnw9288Tg=="
},
+ "node_modules/periscopic": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/periscopic/-/periscopic-3.1.0.tgz",
+ "integrity": "sha512-vKiQ8RRtkl9P+r/+oefh25C3fhybptkHKCZSPlcXiJux2tJF55GnEj3BVn4A5gKfq9NWWXXrxkHBwVPUfH0opw==",
+ "dependencies": {
+ "@types/estree": "^1.0.0",
+ "estree-walker": "^3.0.0",
+ "is-reference": "^3.0.0"
+ }
+ },
"node_modules/picocolors": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.0.0.tgz",
@@ -13957,6 +15657,56 @@
"jsesc": "bin/jsesc"
}
},
+ "node_modules/rehype-parse": {
+ "version": "6.0.2",
+ "resolved": "https://registry.npmjs.org/rehype-parse/-/rehype-parse-6.0.2.tgz",
+ "integrity": "sha512-0S3CpvpTAgGmnz8kiCyFLGuW5yA4OQhyNTm/nwPopZ7+PI11WnGl1TTWTGv/2hPEe/g2jRLlhVVSsoDH8waRug==",
+ "dependencies": {
+ "hast-util-from-parse5": "^5.0.0",
+ "parse5": "^5.0.0",
+ "xtend": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/rehype-parse/node_modules/hast-util-from-parse5": {
+ "version": "5.0.3",
+ "resolved": "https://registry.npmjs.org/hast-util-from-parse5/-/hast-util-from-parse5-5.0.3.tgz",
+ "integrity": "sha512-gOc8UB99F6eWVWFtM9jUikjN7QkWxB3nY0df5Z0Zq1/Nkwl5V4hAAsl0tmwlgWl/1shlTF8DnNYLO8X6wRV9pA==",
+ "dependencies": {
+ "ccount": "^1.0.3",
+ "hastscript": "^5.0.0",
+ "property-information": "^5.0.0",
+ "web-namespaces": "^1.1.2",
+ "xtend": "^4.0.1"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/rehype-parse/node_modules/hastscript": {
+ "version": "5.1.2",
+ "resolved": "https://registry.npmjs.org/hastscript/-/hastscript-5.1.2.tgz",
+ "integrity": "sha512-WlztFuK+Lrvi3EggsqOkQ52rKbxkXL3RwB6t5lwoa8QLMemoWfBuL43eDrwOamJyR7uKQKdmKYaBH1NZBiIRrQ==",
+ "dependencies": {
+ "comma-separated-tokens": "^1.0.0",
+ "hast-util-parse-selector": "^2.0.0",
+ "property-information": "^5.0.0",
+ "space-separated-tokens": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/rehype-parse/node_modules/parse5": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/parse5/-/parse5-5.1.1.tgz",
+ "integrity": "sha512-ugq4DFI0Ptb+WWjAdOK16+u/nHfiIrcE+sh8kZMaM0WllQKLI9rOUq6c2b7cwPkXdzfQESqvoqK6ug7U/Yyzug=="
+ },
"node_modules/relateurl": {
"version": "0.2.7",
"resolved": "https://registry.npmjs.org/relateurl/-/relateurl-0.2.7.tgz",
@@ -13965,6 +15715,40 @@
"node": ">= 0.10"
}
},
+ "node_modules/remark-admonitions": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmjs.org/remark-admonitions/-/remark-admonitions-1.2.1.tgz",
+ "integrity": "sha512-Ji6p68VDvD+H1oS95Fdx9Ar5WA2wcDA4kwrrhVU7fGctC6+d3uiMICu7w7/2Xld+lnU7/gi+432+rRbup5S8ow==",
+ "dependencies": {
+ "rehype-parse": "^6.0.2",
+ "unified": "^8.4.2",
+ "unist-util-visit": "^2.0.1"
+ }
+ },
+ "node_modules/remark-admonitions/node_modules/is-plain-obj": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/is-plain-obj/-/is-plain-obj-2.1.0.tgz",
+ "integrity": "sha512-YWnfyRwxL/+SsrWYfOpUtz5b3YD+nyfkHvjbcanzk8zgyO4ASD67uVMRt8k5bM4lLMDnXfriRhOpemw+NfT1eA==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/remark-admonitions/node_modules/unified": {
+ "version": "8.4.2",
+ "resolved": "https://registry.npmjs.org/unified/-/unified-8.4.2.tgz",
+ "integrity": "sha512-JCrmN13jI4+h9UAyKEoGcDZV+i1E7BLFuG7OsaDvTXI5P0qhHX+vZO/kOhz9jn8HGENDKbwSeB0nVOg4gVStGA==",
+ "dependencies": {
+ "bail": "^1.0.0",
+ "extend": "^3.0.0",
+ "is-plain-obj": "^2.0.0",
+ "trough": "^1.0.0",
+ "vfile": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/remark-emoji": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/remark-emoji/-/remark-emoji-2.2.0.tgz",
@@ -13984,6 +15768,19 @@
"url": "https://opencollective.com/unified"
}
},
+ "node_modules/remark-gfm": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/remark-gfm/-/remark-gfm-1.0.0.tgz",
+ "integrity": "sha512-KfexHJCiqvrdBZVbQ6RopMZGwaXz6wFJEfByIuEwGf0arvITHjiKKZ1dpXujjH9KZdm1//XJQwgfnJ3lmXaDPA==",
+ "dependencies": {
+ "mdast-util-gfm": "^0.1.0",
+ "micromark-extension-gfm": "^0.3.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/remark-mdx": {
"version": "1.6.22",
"resolved": "https://registry.npmjs.org/remark-mdx/-/remark-mdx-1.6.22.tgz",
@@ -14206,6 +16003,189 @@
"url": "https://opencollective.com/unified"
}
},
+ "node_modules/remark-rehype": {
+ "version": "10.1.0",
+ "resolved": "https://registry.npmjs.org/remark-rehype/-/remark-rehype-10.1.0.tgz",
+ "integrity": "sha512-EFmR5zppdBp0WQeDVZ/b66CWJipB2q2VLNFMabzDSGR66Z2fQii83G5gTBbgGEnEEA0QRussvrFHxk1HWGJskw==",
+ "dependencies": {
+ "@types/hast": "^2.0.0",
+ "@types/mdast": "^3.0.0",
+ "mdast-util-to-hast": "^12.1.0",
+ "unified": "^10.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/bail": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/bail/-/bail-2.0.2.tgz",
+ "integrity": "sha512-0xO6mYd7JB2YesxDKplafRpsiOzPt9V02ddPCLbY1xYGPOX24NTyN50qnUxgCPcSoYMhKpAuBTjQoRZCAkUDRw==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/is-plain-obj": {
+ "version": "4.1.0",
+ "resolved": "https://registry.npmjs.org/is-plain-obj/-/is-plain-obj-4.1.0.tgz",
+ "integrity": "sha512-+Pgi+vMuUNkJyExiMBt5IlFoMyKnr5zhJ4Uspz58WOhBF5QoIZkFyNHIbBAtHwzVAgk5RtndVNsDRN61/mmDqg==",
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/mdast-util-definitions": {
+ "version": "5.1.2",
+ "resolved": "https://registry.npmjs.org/mdast-util-definitions/-/mdast-util-definitions-5.1.2.tgz",
+ "integrity": "sha512-8SVPMuHqlPME/z3gqVwWY4zVXn8lqKv/pAhC57FuJ40ImXyBpmO5ukh98zB2v7Blql2FiHjHv9LVztSIqjY+MA==",
+ "dependencies": {
+ "@types/mdast": "^3.0.0",
+ "@types/unist": "^2.0.0",
+ "unist-util-visit": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/mdast-util-to-hast": {
+ "version": "12.3.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-hast/-/mdast-util-to-hast-12.3.0.tgz",
+ "integrity": "sha512-pits93r8PhnIoU4Vy9bjW39M2jJ6/tdHyja9rrot9uujkN7UTU9SDnE6WNJz/IGyQk3XHX6yNNtrBH6cQzm8Hw==",
+ "dependencies": {
+ "@types/hast": "^2.0.0",
+ "@types/mdast": "^3.0.0",
+ "mdast-util-definitions": "^5.0.0",
+ "micromark-util-sanitize-uri": "^1.1.0",
+ "trim-lines": "^3.0.0",
+ "unist-util-generated": "^2.0.0",
+ "unist-util-position": "^4.0.0",
+ "unist-util-visit": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/trough": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/trough/-/trough-2.1.0.tgz",
+ "integrity": "sha512-AqTiAOLcj85xS7vQ8QkAV41hPDIJ71XJB4RCUrzo/1GM2CQwhkJGaf9Hgr7BOugMRpgGUrqRg/DrBDl4H40+8g==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/unified": {
+ "version": "10.1.2",
+ "resolved": "https://registry.npmjs.org/unified/-/unified-10.1.2.tgz",
+ "integrity": "sha512-pUSWAi/RAnVy1Pif2kAoeWNBa3JVrx0MId2LASj8G+7AiHWoKZNTomq6LG326T68U7/e263X6fTdcXIy7XnF7Q==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "bail": "^2.0.0",
+ "extend": "^3.0.0",
+ "is-buffer": "^2.0.0",
+ "is-plain-obj": "^4.0.0",
+ "trough": "^2.0.0",
+ "vfile": "^5.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/unist-util-generated": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/unist-util-generated/-/unist-util-generated-2.0.1.tgz",
+ "integrity": "sha512-qF72kLmPxAw0oN2fwpWIqbXAVyEqUzDHMsbtPvOudIlUzXYFIeQIuxXQCRCFh22B7cixvU0MG7m3MW8FTq/S+A==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/unist-util-is": {
+ "version": "5.2.1",
+ "resolved": "https://registry.npmjs.org/unist-util-is/-/unist-util-is-5.2.1.tgz",
+ "integrity": "sha512-u9njyyfEh43npf1M+yGKDGVPbY/JWEemg5nH05ncKPfi+kBbKBJoTdsogMu33uhytuLlv9y0O7GH7fEdwLdLQw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/unist-util-position": {
+ "version": "4.0.4",
+ "resolved": "https://registry.npmjs.org/unist-util-position/-/unist-util-position-4.0.4.tgz",
+ "integrity": "sha512-kUBE91efOWfIVBo8xzh/uZQ7p9ffYRtUbMRZBNFYwf0RK8koUMx6dGUfwylLOKmaT2cs4wSW96QoYUSXAyEtpg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/unist-util-visit": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmjs.org/unist-util-visit/-/unist-util-visit-4.1.2.tgz",
+ "integrity": "sha512-MSd8OUGISqHdVvfY9TPhyK2VdUrPgxkUtWSuMHF6XAAFuL4LokseigBnZtPnJMu+FbynTkFNnFlyjxpVKujMRg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0",
+ "unist-util-visit-parents": "^5.1.1"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/unist-util-visit-parents": {
+ "version": "5.1.3",
+ "resolved": "https://registry.npmjs.org/unist-util-visit-parents/-/unist-util-visit-parents-5.1.3.tgz",
+ "integrity": "sha512-x6+y8g7wWMyQhL1iZfhIPhDAs7Xwbn9nRosDXl7qoPTSCy0yNxnKc+hWokFifWQIDGi154rdUqKvbCa4+1kLhg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^5.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/vfile": {
+ "version": "5.3.7",
+ "resolved": "https://registry.npmjs.org/vfile/-/vfile-5.3.7.tgz",
+ "integrity": "sha512-r7qlzkgErKjobAmyNIkkSpizsFPYiUPuJb5pNW1RB4JcYVZhs4lIbVqk8XPk033CV/1z8ss5pkax8SuhGpcG8g==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "is-buffer": "^2.0.0",
+ "unist-util-stringify-position": "^3.0.0",
+ "vfile-message": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-rehype/node_modules/vfile-message": {
+ "version": "3.1.4",
+ "resolved": "https://registry.npmjs.org/vfile-message/-/vfile-message-3.1.4.tgz",
+ "integrity": "sha512-fa0Z6P8HUrQN4BZaX05SIVXic+7kE3b05PWAtPuYP9QLHsLKYR7/AlLW3NtOrpXRLeawpDLMsVkmk5DG0NXgWw==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-stringify-position": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/remark-squeeze-paragraphs": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/remark-squeeze-paragraphs/-/remark-squeeze-paragraphs-4.0.0.tgz",
@@ -15445,6 +17425,28 @@
"url": "https://github.com/chalk/strip-ansi?sponsor=1"
}
},
+ "node_modules/stringify-entities": {
+ "version": "4.0.3",
+ "resolved": "https://registry.npmjs.org/stringify-entities/-/stringify-entities-4.0.3.tgz",
+ "integrity": "sha512-BP9nNHMhhfcMbiuQKCqMjhDP5yBCAxsPu4pHFFzJ6Alo9dZgY4VLDPutXqIjpRiMoKdp7Av85Gr73Q5uH9k7+g==",
+ "dependencies": {
+ "character-entities-html4": "^2.0.0",
+ "character-entities-legacy": "^3.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/stringify-entities/node_modules/character-entities-legacy": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/character-entities-legacy/-/character-entities-legacy-3.0.0.tgz",
+ "integrity": "sha512-RpPp0asT/6ufRm//AJVwpViZbGM/MkjQFxJccQRHmISF/22NBtsHqAWmL+/pmkPWoIUJdWyeVleTl1wydHATVQ==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
"node_modules/stringify-object": {
"version": "3.3.0",
"resolved": "https://registry.npmjs.org/stringify-object/-/stringify-object-3.3.0.tgz",
@@ -16033,6 +18035,15 @@
"integrity": "sha512-YzQV+TZg4AxpKxaTHK3c3D+kRDCGVEE7LemdlQZoQXn0iennk10RsIoY6ikzAqJTc9Xjl9C1/waHom/J86ziAQ==",
"deprecated": "Use String.prototype.trim() instead"
},
+ "node_modules/trim-lines": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/trim-lines/-/trim-lines-3.0.1.tgz",
+ "integrity": "sha512-kRj8B+YHZCc9kQYdWfJB2/oUl9rA99qbowYYBtr4ui4mZyAQ2JpvVBd/6U2YloATfqBhBTSMhTpgBHtU0Mf3Rg==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
"node_modules/trim-newlines": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/trim-newlines/-/trim-newlines-3.0.1.tgz",
@@ -16441,6 +18452,18 @@
"url": "https://opencollective.com/unified"
}
},
+ "node_modules/unist-util-position-from-estree": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/unist-util-position-from-estree/-/unist-util-position-from-estree-1.1.2.tgz",
+ "integrity": "sha512-poZa0eXpS+/XpoQwGwl79UUdea4ol2ZuCYguVaJS4qzIOMDzbqz8a3erUCOmubSZkaOuGamb3tX790iwOIROww==",
+ "dependencies": {
+ "@types/unist": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
"node_modules/unist-util-remove": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/unist-util-remove/-/unist-util-remove-2.1.0.tgz",
diff --git a/docs/package.json b/docs/package.json
index c7732b3dc..856e66ebe 100644
--- a/docs/package.json
+++ b/docs/package.json
@@ -15,12 +15,13 @@
},
"dependencies": {
"@babel/preset-react": "^7.22.3",
+ "@code-hike/mdx": "^0.9.0",
"@docusaurus/core": "2.4.1",
"@docusaurus/plugin-ideal-image": "^2.4.1",
"@docusaurus/preset-classic": "2.4.1",
"@docusaurus/theme-classic": "^2.4.1",
"@docusaurus/theme-search-algolia": "^2.4.1",
- "@mdx-js/react": "^1.6.22",
+ "@mdx-js/react": "^2.3.0",
"@mendable/search": "^0.0.114",
"@pbe/react-yandex-maps": "^1.2.4",
"@prismicio/client": "^7.0.1",
@@ -28,6 +29,7 @@
"autoprefixer": "^10.4.14",
"clsx": "^1.2.1",
"docusaurus-plugin-image-zoom": "^0.1.4",
+ "docusaurus-theme-mdx-v2": "^0.1.2",
"jquery": "^3.7.0",
"medium-zoom": "^1.0.8",
"node-fetch": "^3.3.1",
@@ -67,4 +69,4 @@
"engines": {
"node": ">=16.14"
}
-}
\ No newline at end of file
+}
diff --git a/docs/sidebars.js b/docs/sidebars.js
index 01a84cf33..fbabab150 100644
--- a/docs/sidebars.js
+++ b/docs/sidebars.js
@@ -21,6 +21,8 @@ module.exports = {
"guidelines/collection",
"guidelines/prompt-customization",
"guidelines/chat-interface",
+ "guidelines/chat-widget",
+ "guidelines/custom-component",
],
},
{
@@ -30,11 +32,13 @@ module.exports = {
items: [
"components/agents",
"components/chains",
+ "components/custom",
"components/embeddings",
"components/llms",
"components/loaders",
"components/memories",
"components/prompts",
+ "components/retrievers",
"components/text-splitters",
"components/toolkits",
"components/tools",
@@ -63,6 +67,7 @@ module.exports = {
label: "Examples",
collapsed: false,
items: [
+ "examples/flow-runner",
"examples/conversation-chain",
"examples/buffer-memory",
"examples/midjourney-prompt-chain",
diff --git a/docs/src/css/custom.css b/docs/src/css/custom.css
index 2f6f992f3..b79c4df59 100644
--- a/docs/src/css/custom.css
+++ b/docs/src/css/custom.css
@@ -3,17 +3,19 @@
* bundles Infima by default. Infima is a CSS framework designed to
* work well for content-centric websites.
*/
- :root {
+:root {
--ifm-background-color: var(--token-primary-bg-c);
--ifm-navbar-link-hover-color: initial;
--ifm-navbar-padding-vertical: 0;
--ifm-navbar-item-padding-vertical: 0;
- --ifm-font-family-base: -apple-system, BlinkMacSystemFont, Inter, Helvetica, Arial, sans-serif, 'Apple Color Emoji', 'Segoe UI emoji';
- --ifm-font-family-monospace: 'SFMono-Regular', 'Roboto Mono', Consolas, 'Liberation Mono', Menlo, Courier, monospace;
+ --ifm-font-family-base: -apple-system, BlinkMacSystemFont, Inter, Helvetica,
+ Arial, sans-serif, "Apple Color Emoji", "Segoe UI emoji";
+ --ifm-font-family-monospace: "SFMono-Regular", "Roboto Mono", Consolas,
+ "Liberation Mono", Menlo, Courier, monospace;
}
.theme-doc-sidebar-item-category.menu__list-item:not(:first-child) {
- margin-top: 1.5rem!important;
+ margin-top: 1.5rem !important;
}
.docusaurus-highlight-code-line {
@@ -31,7 +33,7 @@
transform: skewY(6deg);
}
-[class^='announcementBar'] {
+[class^="announcementBar"] {
z-index: 10;
}
@@ -112,7 +114,7 @@ body {
}
.header-github-link:before {
- content: '';
+ content: "";
width: 24px;
height: 24px;
display: flex;
@@ -126,7 +128,7 @@ body {
}
.header-twitter-link::before {
- content: '';
+ content: "";
width: 24px;
height: 24px;
display: flex;
@@ -140,7 +142,7 @@ body {
}
.header-discord-link::before {
- content: '';
+ content: "";
width: 24px;
height: 24px;
display: flex;
@@ -148,7 +150,6 @@ body {
background-size: contain;
}
-
/* Images */
.image-rendering-crisp {
image-rendering: crisp-edges;
@@ -164,7 +165,7 @@ body {
.img-center {
display: flex;
justify-content: center;
- width: 100%,
+ width: 100%;
}
.resized-image {
@@ -188,4 +189,22 @@ body {
.mendable-search {
width: 140px;
}
-}
\ No newline at end of file
+}
+/*
+.ch-scrollycoding {
+ gap: 10rem !important;
+} */
+
+.ch-scrollycoding-content {
+ max-width: 55% !important;
+ min-width: 40% !important;
+}
+
+.ch-scrollycoding-sticker {
+ max-width: 60% !important;
+ min-width: 45% !important;
+}
+
+.ch-scrollycoding-step-content {
+ min-height: 70px;
+}
diff --git a/docs/src/theme/ZoomableImage.js b/docs/src/theme/ZoomableImage.js
index 750066bb7..aeeb0454a 100644
--- a/docs/src/theme/ZoomableImage.js
+++ b/docs/src/theme/ZoomableImage.js
@@ -1,8 +1,9 @@
-import React, { useState, useEffect } from 'react';
-import ThemedImage from '@theme/ThemedImage';
-import useBaseUrl from '@docusaurus/useBaseUrl';
+import React, { useState, useEffect } from "react";
+import ThemedImage from "@theme/ThemedImage";
+import useBaseUrl from "@docusaurus/useBaseUrl";
-const ZoomableImage = ({ alt, sources }) => {
+const ZoomableImage = ({ alt, sources, style }) => {
+ // add style here
const [isFullscreen, setIsFullscreen] = useState(false);
const toggleFullscreen = () => {
@@ -10,27 +11,36 @@ const ZoomableImage = ({ alt, sources }) => {
};
const handleKeyPress = (event) => {
- if (event.key === 'Escape') {
+ if (event.key === "Escape") {
setIsFullscreen(false);
}
};
useEffect(() => {
if (isFullscreen) {
- document.addEventListener('keydown', handleKeyPress);
+ document.addEventListener("keydown", handleKeyPress);
} else {
- document.removeEventListener('keydown', handleKeyPress);
+ document.removeEventListener("keydown", handleKeyPress);
}
return () => {
- document.removeEventListener('keydown', handleKeyPress);
+ document.removeEventListener("keydown", handleKeyPress);
};
}, [isFullscreen]);
+ // Default style
+ const defaultStyle = {
+ width: "50%",
+ margin: "0 auto",
+ display: "flex",
+ justifyContent: "center",
+ };
+
return (
-python-versions = ">=3.7,<4.0"
+python-versions = ">=3.7"
files = [
- {file = "aiofiles-23.1.0-py3-none-any.whl", hash = "sha256:9312414ae06472eb6f1d163f555e466a23aed1c8f60c30cccf7121dba2e53eb2"},
- {file = "aiofiles-23.1.0.tar.gz", hash = "sha256:edd247df9a19e0db16534d4baaf536d6609a43e1de5401d7a4c1c148753a1635"},
+ {file = "aiofiles-23.2.1-py3-none-any.whl", hash = "sha256:19297512c647d4b27a2cf7c34caa7e405c0d60b5560618a29a9fe027b18b0107"},
+ {file = "aiofiles-23.2.1.tar.gz", hash = "sha256:84ec2218d8419404abcb9f0c02df3f34c6e0a68ed41072acfb1cef5cbc29051a"},
]
[[package]]
name = "aiohttp"
version = "3.8.5"
description = "Async http client/server framework (asyncio)"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -125,7 +123,6 @@ speedups = ["Brotli", "aiodns", "cchardet"]
name = "aiosignal"
version = "1.3.1"
description = "aiosignal: a list of registered asynchronous callbacks"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -140,7 +137,6 @@ frozenlist = ">=1.1.0"
name = "aiostream"
version = "0.4.5"
description = "Generator-based operators for asynchronous iteration"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -148,16 +144,34 @@ files = [
{file = "aiostream-0.4.5.tar.gz", hash = "sha256:3ecbf87085230fbcd9605c32ca20c4fb41af02c71d076eab246ea22e35947d88"},
]
+[[package]]
+name = "alembic"
+version = "1.11.2"
+description = "A database migration tool for SQLAlchemy."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "alembic-1.11.2-py3-none-any.whl", hash = "sha256:7981ab0c4fad4fe1be0cf183aae17689fe394ff874fd2464adb774396faf0796"},
+ {file = "alembic-1.11.2.tar.gz", hash = "sha256:678f662130dc540dac12de0ea73de9f89caea9dbea138f60ef6263149bf84657"},
+]
+
+[package.dependencies]
+Mako = "*"
+SQLAlchemy = ">=1.3.0"
+typing-extensions = ">=4"
+
+[package.extras]
+tz = ["python-dateutil"]
+
[[package]]
name = "anthropic"
-version = "0.3.6"
+version = "0.3.8"
description = "Client library for the anthropic API"
-category = "main"
optional = false
python-versions = ">=3.7,<4.0"
files = [
- {file = "anthropic-0.3.6-py3-none-any.whl", hash = "sha256:45036a96f38598be82237c12d77d7aefe814a3bceb9da0bc6721a381c29821b1"},
- {file = "anthropic-0.3.6.tar.gz", hash = "sha256:6e644c84ad9375dc12e07b36aab1862ca4db98eb1750e08acfe4847e62afe0dd"},
+ {file = "anthropic-0.3.8-py3-none-any.whl", hash = "sha256:97ffe1bacc4214dc89b19f496cf2769746971e86f7c835a05aa21b76f260d279"},
+ {file = "anthropic-0.3.8.tar.gz", hash = "sha256:6651099807456c3b95b3879f5ad7d00f7e7e4f7649a2394d18032ab8be54ef16"},
]
[package.dependencies]
@@ -172,7 +186,6 @@ typing-extensions = ">=4.1.1,<5"
name = "anyio"
version = "3.7.1"
description = "High level compatibility layer for multiple asynchronous event loop implementations"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -194,7 +207,6 @@ trio = ["trio (<0.22)"]
name = "appdirs"
version = "1.4.4"
description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -206,7 +218,6 @@ files = [
name = "appnope"
version = "0.1.3"
description = "Disable App Nap on macOS >= 10.9"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -218,7 +229,6 @@ files = [
name = "argilla"
version = "0.0.1"
description = ""
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -230,7 +240,6 @@ files = [
name = "asgiref"
version = "3.7.2"
description = "ASGI specs, helper code, and adapters"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -248,7 +257,6 @@ tests = ["mypy (>=0.800)", "pytest", "pytest-asyncio"]
name = "asttokens"
version = "2.2.1"
description = "Annotate AST trees with source code positions"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -266,7 +274,6 @@ test = ["astroid", "pytest"]
name = "async-timeout"
version = "4.0.2"
description = "Timeout context manager for asyncio programs"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -278,7 +285,6 @@ files = [
name = "attrs"
version = "23.1.0"
description = "Classes Without Boilerplate"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -297,7 +303,6 @@ tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pyte
name = "authlib"
version = "1.2.1"
description = "The ultimate Python library in building OAuth and OpenID Connect servers and clients."
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -312,7 +317,6 @@ cryptography = ">=3.2"
name = "backcall"
version = "0.2.0"
description = "Specifications for callback functions passed in to an API"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -324,7 +328,6 @@ files = [
name = "backoff"
version = "2.2.1"
description = "Function decoration for backoff and retry"
-category = "main"
optional = false
python-versions = ">=3.7,<4.0"
files = [
@@ -336,7 +339,6 @@ files = [
name = "beautifulsoup4"
version = "4.12.2"
description = "Screen-scraping library"
-category = "main"
optional = false
python-versions = ">=3.6.0"
files = [
@@ -355,7 +357,6 @@ lxml = ["lxml"]
name = "black"
version = "23.7.0"
description = "The uncompromising code formatter."
-category = "dev"
optional = false
python-versions = ">=3.8"
files = [
@@ -402,7 +403,6 @@ uvloop = ["uvloop (>=0.15.2)"]
name = "bleach"
version = "6.0.0"
description = "An easy safelist-based HTML-sanitizing tool."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -421,7 +421,6 @@ css = ["tinycss2 (>=1.1.0,<1.2)"]
name = "cachetools"
version = "5.3.1"
description = "Extensible memoizing collections and decorators"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -433,7 +432,6 @@ files = [
name = "certifi"
version = "2023.7.22"
description = "Python package for providing Mozilla's CA Bundle."
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -445,7 +443,6 @@ files = [
name = "cffi"
version = "1.15.1"
description = "Foreign Function Interface for Python calling C code."
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -520,21 +517,19 @@ pycparser = "*"
[[package]]
name = "chardet"
-version = "5.1.0"
+version = "5.2.0"
description = "Universal encoding detector for Python 3"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "chardet-5.1.0-py3-none-any.whl", hash = "sha256:362777fb014af596ad31334fde1e8c327dfdb076e1960d1694662d46a6917ab9"},
- {file = "chardet-5.1.0.tar.gz", hash = "sha256:0d62712b956bc154f85fb0a266e2a3c5913c2967e00348701b32411d6def31e5"},
+ {file = "chardet-5.2.0-py3-none-any.whl", hash = "sha256:e1cf59446890a00105fe7b7912492ea04b6e6f06d4b742b2c788469e34c82970"},
+ {file = "chardet-5.2.0.tar.gz", hash = "sha256:1b3b6ff479a8c414bc3fa2c0852995695c4a026dcd6d0633b2dd092ca39c1cf7"},
]
[[package]]
name = "charset-normalizer"
version = "3.2.0"
description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
-category = "main"
optional = false
python-versions = ">=3.7.0"
files = [
@@ -619,7 +614,6 @@ files = [
name = "chromadb"
version = "0.3.26"
description = "Chroma."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -649,7 +643,6 @@ uvicorn = {version = ">=0.18.3", extras = ["standard"]}
name = "click"
version = "8.1.6"
description = "Composable command line interface toolkit"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -664,7 +657,6 @@ colorama = {version = "*", markers = "platform_system == \"Windows\""}
name = "click-log"
version = "0.4.0"
description = "Logging integration for Click"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -679,7 +671,6 @@ click = "*"
name = "clickhouse-connect"
version = "0.6.8"
description = "ClickHouse Database Core Driver for Python, Pandas, and Superset"
-category = "main"
optional = false
python-versions = "~=3.7"
files = [
@@ -767,19 +758,19 @@ sqlalchemy = ["sqlalchemy (>1.3.21,<2.0)"]
[[package]]
name = "cohere"
-version = "4.17.0"
+version = "4.19.2"
description = ""
-category = "main"
optional = false
python-versions = ">=3.7,<4.0"
files = [
- {file = "cohere-4.17.0-py3-none-any.whl", hash = "sha256:44e0bdb0a2d9467506d27b285f542177b98f92647f27e17ea921a01006fe2f33"},
- {file = "cohere-4.17.0.tar.gz", hash = "sha256:9f479543b50490b4cb6385468d7571ad891a09cde7bd6b028171596bac6ce6ff"},
+ {file = "cohere-4.19.2-py3-none-any.whl", hash = "sha256:0b6a4fe04380a481a8e975ebcc9bb6433febe4d3eb583b6d6e04342a5e998345"},
+ {file = "cohere-4.19.2.tar.gz", hash = "sha256:a0b0fa698b3d3983fb328bb90d68fcf08faaa2268f3772ebc6bfea6ba55acf27"},
]
[package.dependencies]
aiohttp = ">=3.0,<4.0"
backoff = ">=2.0,<3.0"
+fastavro = {version = "1.8.2", markers = "python_version >= \"3.8\""}
importlib_metadata = ">=6.0,<7.0"
requests = ">=2.25.0,<3.0.0"
urllib3 = ">=1.26,<3"
@@ -788,7 +779,6 @@ urllib3 = ">=1.26,<3"
name = "colorama"
version = "0.4.6"
description = "Cross-platform colored terminal text."
-category = "main"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
files = [
@@ -800,7 +790,6 @@ files = [
name = "coloredlogs"
version = "15.0.1"
description = "Colored terminal output for Python's logging module"
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
files = [
@@ -816,18 +805,17 @@ cron = ["capturer (>=2.4)"]
[[package]]
name = "comm"
-version = "0.1.3"
+version = "0.1.4"
description = "Jupyter Python Comm implementation, for usage in ipykernel, xeus-python etc."
-category = "dev"
optional = false
python-versions = ">=3.6"
files = [
- {file = "comm-0.1.3-py3-none-any.whl", hash = "sha256:16613c6211e20223f215fc6d3b266a247b6e2641bf4e0a3ad34cb1aff2aa3f37"},
- {file = "comm-0.1.3.tar.gz", hash = "sha256:a61efa9daffcfbe66fd643ba966f846a624e4e6d6767eda9cf6e993aadaab93e"},
+ {file = "comm-0.1.4-py3-none-any.whl", hash = "sha256:6d52794cba11b36ed9860999cd10fd02d6b2eac177068fdd585e1e2f8a96e67a"},
+ {file = "comm-0.1.4.tar.gz", hash = "sha256:354e40a59c9dd6db50c5cc6b4acc887d82e9603787f83b68c01a80a923984d15"},
]
[package.dependencies]
-traitlets = ">=5.3"
+traitlets = ">=4"
[package.extras]
lint = ["black (>=22.6.0)", "mdformat (>0.7)", "mdformat-gfm (>=0.3.5)", "ruff (>=0.0.156)"]
@@ -838,7 +826,6 @@ typing = ["mypy (>=0.990)"]
name = "coverage"
version = "7.2.7"
description = "Code coverage measurement for Python"
-category = "dev"
optional = false
python-versions = ">=3.7"
files = [
@@ -912,35 +899,34 @@ toml = ["tomli"]
[[package]]
name = "cryptography"
-version = "41.0.2"
+version = "41.0.3"
description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "cryptography-41.0.2-cp37-abi3-macosx_10_12_universal2.whl", hash = "sha256:01f1d9e537f9a15b037d5d9ee442b8c22e3ae11ce65ea1f3316a41c78756b711"},
- {file = "cryptography-41.0.2-cp37-abi3-macosx_10_12_x86_64.whl", hash = "sha256:079347de771f9282fbfe0e0236c716686950c19dee1b76240ab09ce1624d76d7"},
- {file = "cryptography-41.0.2-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:439c3cc4c0d42fa999b83ded80a9a1fb54d53c58d6e59234cfe97f241e6c781d"},
- {file = "cryptography-41.0.2-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f14ad275364c8b4e525d018f6716537ae7b6d369c094805cae45300847e0894f"},
- {file = "cryptography-41.0.2-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:84609ade00a6ec59a89729e87a503c6e36af98ddcd566d5f3be52e29ba993182"},
- {file = "cryptography-41.0.2-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:49c3222bb8f8e800aead2e376cbef687bc9e3cb9b58b29a261210456a7783d83"},
- {file = "cryptography-41.0.2-cp37-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:d73f419a56d74fef257955f51b18d046f3506270a5fd2ac5febbfa259d6c0fa5"},
- {file = "cryptography-41.0.2-cp37-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:2a034bf7d9ca894720f2ec1d8b7b5832d7e363571828037f9e0c4f18c1b58a58"},
- {file = "cryptography-41.0.2-cp37-abi3-win32.whl", hash = "sha256:d124682c7a23c9764e54ca9ab5b308b14b18eba02722b8659fb238546de83a76"},
- {file = "cryptography-41.0.2-cp37-abi3-win_amd64.whl", hash = "sha256:9c3fe6534d59d071ee82081ca3d71eed3210f76ebd0361798c74abc2bcf347d4"},
- {file = "cryptography-41.0.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:a719399b99377b218dac6cf547b6ec54e6ef20207b6165126a280b0ce97e0d2a"},
- {file = "cryptography-41.0.2-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:182be4171f9332b6741ee818ec27daff9fb00349f706629f5cbf417bd50e66fd"},
- {file = "cryptography-41.0.2-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:7a9a3bced53b7f09da251685224d6a260c3cb291768f54954e28f03ef14e3766"},
- {file = "cryptography-41.0.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:f0dc40e6f7aa37af01aba07277d3d64d5a03dc66d682097541ec4da03cc140ee"},
- {file = "cryptography-41.0.2-pp38-pypy38_pp73-macosx_10_12_x86_64.whl", hash = "sha256:674b669d5daa64206c38e507808aae49904c988fa0a71c935e7006a3e1e83831"},
- {file = "cryptography-41.0.2-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:7af244b012711a26196450d34f483357e42aeddb04128885d95a69bd8b14b69b"},
- {file = "cryptography-41.0.2-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:9b6d717393dbae53d4e52684ef4f022444fc1cce3c48c38cb74fca29e1f08eaa"},
- {file = "cryptography-41.0.2-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:192255f539d7a89f2102d07d7375b1e0a81f7478925b3bc2e0549ebf739dae0e"},
- {file = "cryptography-41.0.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:f772610fe364372de33d76edcd313636a25684edb94cee53fd790195f5989d14"},
- {file = "cryptography-41.0.2-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:b332cba64d99a70c1e0836902720887fb4529ea49ea7f5462cf6640e095e11d2"},
- {file = "cryptography-41.0.2-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:9a6673c1828db6270b76b22cc696f40cde9043eb90373da5c2f8f2158957f42f"},
- {file = "cryptography-41.0.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:342f3767e25876751e14f8459ad85e77e660537ca0a066e10e75df9c9e9099f0"},
- {file = "cryptography-41.0.2.tar.gz", hash = "sha256:7d230bf856164de164ecb615ccc14c7fc6de6906ddd5b491f3af90d3514c925c"},
+ {file = "cryptography-41.0.3-cp37-abi3-macosx_10_12_universal2.whl", hash = "sha256:652627a055cb52a84f8c448185922241dd5217443ca194d5739b44612c5e6507"},
+ {file = "cryptography-41.0.3-cp37-abi3-macosx_10_12_x86_64.whl", hash = "sha256:8f09daa483aedea50d249ef98ed500569841d6498aa9c9f4b0531b9964658922"},
+ {file = "cryptography-41.0.3-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4fd871184321100fb400d759ad0cddddf284c4b696568204d281c902fc7b0d81"},
+ {file = "cryptography-41.0.3-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:84537453d57f55a50a5b6835622ee405816999a7113267739a1b4581f83535bd"},
+ {file = "cryptography-41.0.3-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:3fb248989b6363906827284cd20cca63bb1a757e0a2864d4c1682a985e3dca47"},
+ {file = "cryptography-41.0.3-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:42cb413e01a5d36da9929baa9d70ca90d90b969269e5a12d39c1e0d475010116"},
+ {file = "cryptography-41.0.3-cp37-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:aeb57c421b34af8f9fe830e1955bf493a86a7996cc1338fe41b30047d16e962c"},
+ {file = "cryptography-41.0.3-cp37-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:6af1c6387c531cd364b72c28daa29232162010d952ceb7e5ca8e2827526aceae"},
+ {file = "cryptography-41.0.3-cp37-abi3-win32.whl", hash = "sha256:0d09fb5356f975974dbcb595ad2d178305e5050656affb7890a1583f5e02a306"},
+ {file = "cryptography-41.0.3-cp37-abi3-win_amd64.whl", hash = "sha256:a983e441a00a9d57a4d7c91b3116a37ae602907a7618b882c8013b5762e80574"},
+ {file = "cryptography-41.0.3-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5259cb659aa43005eb55a0e4ff2c825ca111a0da1814202c64d28a985d33b087"},
+ {file = "cryptography-41.0.3-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:67e120e9a577c64fe1f611e53b30b3e69744e5910ff3b6e97e935aeb96005858"},
+ {file = "cryptography-41.0.3-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:7efe8041897fe7a50863e51b77789b657a133c75c3b094e51b5e4b5cec7bf906"},
+ {file = "cryptography-41.0.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:ce785cf81a7bdade534297ef9e490ddff800d956625020ab2ec2780a556c313e"},
+ {file = "cryptography-41.0.3-pp38-pypy38_pp73-macosx_10_12_x86_64.whl", hash = "sha256:57a51b89f954f216a81c9d057bf1a24e2f36e764a1ca9a501a6964eb4a6800dd"},
+ {file = "cryptography-41.0.3-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:4c2f0d35703d61002a2bbdcf15548ebb701cfdd83cdc12471d2bae80878a4207"},
+ {file = "cryptography-41.0.3-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:23c2d778cf829f7d0ae180600b17e9fceea3c2ef8b31a99e3c694cbbf3a24b84"},
+ {file = "cryptography-41.0.3-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:95dd7f261bb76948b52a5330ba5202b91a26fbac13ad0e9fc8a3ac04752058c7"},
+ {file = "cryptography-41.0.3-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:41d7aa7cdfded09b3d73a47f429c298e80796c8e825ddfadc84c8a7f12df212d"},
+ {file = "cryptography-41.0.3-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:d0d651aa754ef58d75cec6edfbd21259d93810b73f6ec246436a21b7841908de"},
+ {file = "cryptography-41.0.3-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:ab8de0d091acbf778f74286f4989cf3d1528336af1b59f3e5d2ebca8b5fe49e1"},
+ {file = "cryptography-41.0.3-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:a74fbcdb2a0d46fe00504f571a2a540532f4c188e6ccf26f1f178480117b33c4"},
+ {file = "cryptography-41.0.3.tar.gz", hash = "sha256:6d192741113ef5e30d89dcb5b956ef4e1578f304708701b8b73d38e3e1461f34"},
]
[package.dependencies]
@@ -958,32 +944,33 @@ test-randomorder = ["pytest-randomly"]
[[package]]
name = "ctransformers"
-version = "0.2.14"
+version = "0.2.21"
description = "Python bindings for the Transformer models implemented in C/C++ using GGML library."
-category = "main"
-optional = false
+optional = true
python-versions = "*"
files = [
- {file = "ctransformers-0.2.14-py3-none-any.whl", hash = "sha256:80064fd1b724a4b6020e1f7ee6b54eb5004eb52a1c81aef734565c6573962190"},
- {file = "ctransformers-0.2.14.tar.gz", hash = "sha256:0b58ad4da874a363ae2eb89d2cdb1956108d9c3b894cb16787001111a2075328"},
+ {file = "ctransformers-0.2.21-py3-none-any.whl", hash = "sha256:18a0555d02f55a3935f5544b885038562f80e497a6197d8e871941a087dba546"},
+ {file = "ctransformers-0.2.21.tar.gz", hash = "sha256:58e7a699050a106688b967faa59f377886e22a581fde6cd36821dfa541995677"},
]
[package.dependencies]
huggingface-hub = "*"
+py-cpuinfo = ">=9.0.0,<10.0.0"
[package.extras]
+cuda = ["nvidia-cublas-cu12", "nvidia-cuda-runtime-cu12"]
+gptq = ["exllama (==0.1.0)"]
tests = ["pytest"]
[[package]]
name = "dataclasses-json"
-version = "0.5.13"
+version = "0.5.14"
description = "Easily serialize dataclasses to and from JSON."
-category = "main"
optional = false
-python-versions = ">=3.7,<3.12"
+python-versions = ">=3.7,<3.13"
files = [
- {file = "dataclasses_json-0.5.13-py3-none-any.whl", hash = "sha256:97b13447f2e0b96aa6e52509040c12d70c61df8a972f3feb5cc89a6da5e177bd"},
- {file = "dataclasses_json-0.5.13.tar.gz", hash = "sha256:425810e1356fb6917eb7c323e3aaee0c9398fc55b5001d3532381679f727fc18"},
+ {file = "dataclasses_json-0.5.14-py3-none-any.whl", hash = "sha256:5ec6fed642adb1dbdb4182badb01e0861badfd8fda82e3b67f44b2d1e9d10d21"},
+ {file = "dataclasses_json-0.5.14.tar.gz", hash = "sha256:d82896a94c992ffaf689cd1fafc180164e2abdd415b8f94a7f78586af5886236"},
]
[package.dependencies]
@@ -992,37 +979,35 @@ typing-inspect = ">=0.4.0,<1"
[[package]]
name = "debugpy"
-version = "1.6.7"
+version = "1.6.7.post1"
description = "An implementation of the Debug Adapter Protocol for Python"
-category = "dev"
optional = false
python-versions = ">=3.7"
files = [
- {file = "debugpy-1.6.7-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:b3e7ac809b991006ad7f857f016fa92014445085711ef111fdc3f74f66144096"},
- {file = "debugpy-1.6.7-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e3876611d114a18aafef6383695dfc3f1217c98a9168c1aaf1a02b01ec7d8d1e"},
- {file = "debugpy-1.6.7-cp310-cp310-win32.whl", hash = "sha256:33edb4afa85c098c24cc361d72ba7c21bb92f501104514d4ffec1fb36e09c01a"},
- {file = "debugpy-1.6.7-cp310-cp310-win_amd64.whl", hash = "sha256:ed6d5413474e209ba50b1a75b2d9eecf64d41e6e4501977991cdc755dc83ab0f"},
- {file = "debugpy-1.6.7-cp37-cp37m-macosx_10_15_x86_64.whl", hash = "sha256:38ed626353e7c63f4b11efad659be04c23de2b0d15efff77b60e4740ea685d07"},
- {file = "debugpy-1.6.7-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:279d64c408c60431c8ee832dfd9ace7c396984fd7341fa3116aee414e7dcd88d"},
- {file = "debugpy-1.6.7-cp37-cp37m-win32.whl", hash = "sha256:dbe04e7568aa69361a5b4c47b4493d5680bfa3a911d1e105fbea1b1f23f3eb45"},
- {file = "debugpy-1.6.7-cp37-cp37m-win_amd64.whl", hash = "sha256:f90a2d4ad9a035cee7331c06a4cf2245e38bd7c89554fe3b616d90ab8aab89cc"},
- {file = "debugpy-1.6.7-cp38-cp38-macosx_10_15_x86_64.whl", hash = "sha256:5224eabbbeddcf1943d4e2821876f3e5d7d383f27390b82da5d9558fd4eb30a9"},
- {file = "debugpy-1.6.7-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bae1123dff5bfe548ba1683eb972329ba6d646c3a80e6b4c06cd1b1dd0205e9b"},
- {file = "debugpy-1.6.7-cp38-cp38-win32.whl", hash = "sha256:9cd10cf338e0907fdcf9eac9087faa30f150ef5445af5a545d307055141dd7a4"},
- {file = "debugpy-1.6.7-cp38-cp38-win_amd64.whl", hash = "sha256:aaf6da50377ff4056c8ed470da24632b42e4087bc826845daad7af211e00faad"},
- {file = "debugpy-1.6.7-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:0679b7e1e3523bd7d7869447ec67b59728675aadfc038550a63a362b63029d2c"},
- {file = "debugpy-1.6.7-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:de86029696e1b3b4d0d49076b9eba606c226e33ae312a57a46dca14ff370894d"},
- {file = "debugpy-1.6.7-cp39-cp39-win32.whl", hash = "sha256:d71b31117779d9a90b745720c0eab54ae1da76d5b38c8026c654f4a066b0130a"},
- {file = "debugpy-1.6.7-cp39-cp39-win_amd64.whl", hash = "sha256:c0ff93ae90a03b06d85b2c529eca51ab15457868a377c4cc40a23ab0e4e552a3"},
- {file = "debugpy-1.6.7-py2.py3-none-any.whl", hash = "sha256:53f7a456bc50706a0eaabecf2d3ce44c4d5010e46dfc65b6b81a518b42866267"},
- {file = "debugpy-1.6.7.zip", hash = "sha256:c4c2f0810fa25323abfdfa36cbbbb24e5c3b1a42cb762782de64439c575d67f2"},
+ {file = "debugpy-1.6.7.post1-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:903bd61d5eb433b6c25b48eae5e23821d4c1a19e25c9610205f5aeaccae64e32"},
+ {file = "debugpy-1.6.7.post1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d16882030860081e7dd5aa619f30dec3c2f9a421e69861125f83cc372c94e57d"},
+ {file = "debugpy-1.6.7.post1-cp310-cp310-win32.whl", hash = "sha256:eea8d8cfb9965ac41b99a61f8e755a8f50e9a20330938ad8271530210f54e09c"},
+ {file = "debugpy-1.6.7.post1-cp310-cp310-win_amd64.whl", hash = "sha256:85969d864c45f70c3996067cfa76a319bae749b04171f2cdeceebe4add316155"},
+ {file = "debugpy-1.6.7.post1-cp37-cp37m-macosx_11_0_x86_64.whl", hash = "sha256:890f7ab9a683886a0f185786ffbda3b46495c4b929dab083b8c79d6825832a52"},
+ {file = "debugpy-1.6.7.post1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d4ac7a4dba28801d184b7fc0e024da2635ca87d8b0a825c6087bb5168e3c0d28"},
+ {file = "debugpy-1.6.7.post1-cp37-cp37m-win32.whl", hash = "sha256:3370ef1b9951d15799ef7af41f8174194f3482ee689988379763ef61a5456426"},
+ {file = "debugpy-1.6.7.post1-cp37-cp37m-win_amd64.whl", hash = "sha256:65b28435a17cba4c09e739621173ff90c515f7b9e8ea469b92e3c28ef8e5cdfb"},
+ {file = "debugpy-1.6.7.post1-cp38-cp38-macosx_11_0_x86_64.whl", hash = "sha256:92b6dae8bfbd497c90596bbb69089acf7954164aea3228a99d7e43e5267f5b36"},
+ {file = "debugpy-1.6.7.post1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:72f5d2ecead8125cf669e62784ef1e6300f4067b0f14d9f95ee00ae06fc7c4f7"},
+ {file = "debugpy-1.6.7.post1-cp38-cp38-win32.whl", hash = "sha256:f0851403030f3975d6e2eaa4abf73232ab90b98f041e3c09ba33be2beda43fcf"},
+ {file = "debugpy-1.6.7.post1-cp38-cp38-win_amd64.whl", hash = "sha256:3de5d0f97c425dc49bce4293df6a04494309eedadd2b52c22e58d95107e178d9"},
+ {file = "debugpy-1.6.7.post1-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:38651c3639a4e8bbf0ca7e52d799f6abd07d622a193c406be375da4d510d968d"},
+ {file = "debugpy-1.6.7.post1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:038c51268367c9c935905a90b1c2d2dbfe304037c27ba9d19fe7409f8cdc710c"},
+ {file = "debugpy-1.6.7.post1-cp39-cp39-win32.whl", hash = "sha256:4b9eba71c290852f959d2cf8a03af28afd3ca639ad374d393d53d367f7f685b2"},
+ {file = "debugpy-1.6.7.post1-cp39-cp39-win_amd64.whl", hash = "sha256:973a97ed3b434eab0f792719a484566c35328196540676685c975651266fccf9"},
+ {file = "debugpy-1.6.7.post1-py2.py3-none-any.whl", hash = "sha256:1093a5c541af079c13ac8c70ab8b24d1d35c8cacb676306cf11e57f699c02926"},
+ {file = "debugpy-1.6.7.post1.zip", hash = "sha256:fe87ec0182ef624855d05e6ed7e0b7cb1359d2ffa2a925f8ec2d22e98b75d0ca"},
]
[[package]]
name = "decorator"
version = "5.1.1"
description = "Decorators for Humans"
-category = "main"
optional = false
python-versions = ">=3.5"
files = [
@@ -1034,7 +1019,6 @@ files = [
name = "deprecated"
version = "1.2.14"
description = "Python @deprecated decorator to deprecate old python classes, functions or methods."
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
@@ -1052,7 +1036,6 @@ dev = ["PyTest", "PyTest-Cov", "bump2version (<1)", "sphinx (<2)", "tox"]
name = "deprecation"
version = "2.1.0"
description = "A library to handle automated deprecations"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -1067,7 +1050,6 @@ packaging = "*"
name = "dill"
version = "0.3.7"
description = "serialize all of Python"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1082,8 +1064,7 @@ graph = ["objgraph (>=1.7.2)"]
name = "diskcache"
version = "5.6.1"
description = "Disk Cache -- Disk and file backed persistent cache."
-category = "main"
-optional = false
+optional = true
python-versions = ">=3"
files = [
{file = "diskcache-5.6.1-py3-none-any.whl", hash = "sha256:558c6a2d5d7c721bb00e40711803d6804850c9f76c426ed81ecc627fe9d2ce2d"},
@@ -1094,7 +1075,6 @@ files = [
name = "distro"
version = "1.8.0"
description = "Distro - an OS platform information API"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -1104,20 +1084,18 @@ files = [
[[package]]
name = "dnspython"
-version = "2.3.0"
+version = "2.4.2"
description = "DNS toolkit"
-category = "main"
optional = false
-python-versions = ">=3.7,<4.0"
+python-versions = ">=3.8,<4.0"
files = [
- {file = "dnspython-2.3.0-py3-none-any.whl", hash = "sha256:89141536394f909066cabd112e3e1a37e4e654db00a25308b0f130bc3152eb46"},
- {file = "dnspython-2.3.0.tar.gz", hash = "sha256:224e32b03eb46be70e12ef6d64e0be123a64e621ab4c0822ff6d450d52a540b9"},
+ {file = "dnspython-2.4.2-py3-none-any.whl", hash = "sha256:57c6fbaaeaaf39c891292012060beb141791735dbb4004798328fc2c467402d8"},
+ {file = "dnspython-2.4.2.tar.gz", hash = "sha256:8dcfae8c7460a2f84b4072e26f1c9f4101ca20c071649cb7c34e8b6a93d58984"},
]
[package.extras]
-curio = ["curio (>=1.2,<2.0)", "sniffio (>=1.1,<2.0)"]
-dnssec = ["cryptography (>=2.6,<40.0)"]
-doh = ["h2 (>=4.1.0)", "httpx (>=0.21.1)", "requests (>=2.23.0,<3.0.0)", "requests-toolbelt (>=0.9.1,<0.11.0)"]
+dnssec = ["cryptography (>=2.6,<42.0)"]
+doh = ["h2 (>=4.1.0)", "httpcore (>=0.17.3)", "httpx (>=0.24.1)"]
doq = ["aioquic (>=0.9.20)"]
idna = ["idna (>=2.1,<4.0)"]
trio = ["trio (>=0.14,<0.23)"]
@@ -1127,7 +1105,6 @@ wmi = ["wmi (>=1.5.1,<2.0.0)"]
name = "docarray"
version = "0.21.1"
description = "The data structure for unstructured data"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -1156,7 +1133,6 @@ weaviate = ["weaviate-client (>=3.9.0,<3.10.0)"]
name = "docker"
version = "6.1.3"
description = "A Python library for the Docker Engine API."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1178,7 +1154,6 @@ ssh = ["paramiko (>=2.4.3)"]
name = "docstring-parser"
version = "0.15"
description = "Parse Python docstrings in reST, Google and Numpydoc format"
-category = "main"
optional = false
python-versions = ">=3.6,<4.0"
files = [
@@ -1190,7 +1165,6 @@ files = [
name = "docutils"
version = "0.20.1"
description = "Docutils -- Python Documentation Utilities"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1202,7 +1176,6 @@ files = [
name = "dotty-dict"
version = "1.3.1"
description = "Dictionary wrapper for quick access to deeply nested keys."
-category = "main"
optional = false
python-versions = ">=3.5,<4.0"
files = [
@@ -1214,7 +1187,6 @@ files = [
name = "duckdb"
version = "0.8.1"
description = "DuckDB embedded database"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -1276,7 +1248,6 @@ files = [
name = "ecdsa"
version = "0.18.0"
description = "ECDSA cryptographic signature library (pure python)"
-category = "main"
optional = false
python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
files = [
@@ -1295,7 +1266,6 @@ gmpy2 = ["gmpy2"]
name = "et-xmlfile"
version = "1.1.0"
description = "An implementation of lxml.xmlfile for the standard library"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -1307,7 +1277,6 @@ files = [
name = "exceptiongroup"
version = "1.1.2"
description = "Backport of PEP 654 (exception groups)"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1322,7 +1291,6 @@ test = ["pytest (>=6)"]
name = "executing"
version = "1.2.0"
description = "Get the currently executing AST node of a frame, and other information"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -1337,7 +1305,6 @@ tests = ["asttokens", "littleutils", "pytest", "rich"]
name = "faiss-cpu"
version = "1.7.4"
description = "A library for efficient similarity search and clustering of dense vectors."
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -1370,14 +1337,13 @@ files = [
[[package]]
name = "fake-useragent"
-version = "1.1.3"
+version = "1.2.1"
description = "Up-to-date simple useragent faker with real world database"
-category = "main"
optional = false
python-versions = "*"
files = [
- {file = "fake-useragent-1.1.3.tar.gz", hash = "sha256:1c06f0aa7d6e4894b919b30b9c7ebd72ff497325191057fbb5df3d5db06b93fc"},
- {file = "fake_useragent-1.1.3-py3-none-any.whl", hash = "sha256:695d3b1bf7d11d04ab0f971fb73b0ca8de98b78bbadfbc8bacbc9a48423f7531"},
+ {file = "fake-useragent-1.2.1.tar.gz", hash = "sha256:b411f903331f695e3840ccadcf011f745a405764e97c588f2b8fde9e400a5446"},
+ {file = "fake_useragent-1.2.1-py3-none-any.whl", hash = "sha256:ad2b5414d19493d0789572f04200d4f656f84d20b205cc805233212957fe385d"},
]
[package.dependencies]
@@ -1385,14 +1351,13 @@ importlib-resources = {version = ">=5.0", markers = "python_version < \"3.10\""}
[[package]]
name = "fastapi"
-version = "0.100.0"
+version = "0.100.1"
description = "FastAPI framework, high performance, easy to learn, fast to code, ready for production"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "fastapi-0.100.0-py3-none-any.whl", hash = "sha256:271662daf986da8fa98dc2b7c7f61c4abdfdccfb4786d79ed8b2878f172c6d5f"},
- {file = "fastapi-0.100.0.tar.gz", hash = "sha256:acb5f941ea8215663283c10018323ba7ea737c571b67fc7e88e9469c7eb1d12e"},
+ {file = "fastapi-0.100.1-py3-none-any.whl", hash = "sha256:ec6dd52bfc4eff3063cfcd0713b43c87640fefb2687bbbe3d8a08d94049cdf32"},
+ {file = "fastapi-0.100.1.tar.gz", hash = "sha256:522700d7a469e4a973d92321ab93312448fbe20fca9c8da97effc7e7bc56df23"},
]
[package.dependencies]
@@ -1403,11 +1368,50 @@ typing-extensions = ">=4.5.0"
[package.extras]
all = ["email-validator (>=2.0.0)", "httpx (>=0.23.0)", "itsdangerous (>=1.1.0)", "jinja2 (>=2.11.2)", "orjson (>=3.2.1)", "pydantic-extra-types (>=2.0.0)", "pydantic-settings (>=2.0.0)", "python-multipart (>=0.0.5)", "pyyaml (>=5.3.1)", "ujson (>=4.0.1,!=4.0.2,!=4.1.0,!=4.2.0,!=4.3.0,!=5.0.0,!=5.1.0)", "uvicorn[standard] (>=0.12.0)"]
+[[package]]
+name = "fastavro"
+version = "1.8.2"
+description = "Fast read/write of AVRO files"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "fastavro-1.8.2-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:0e08964b2e9a455d831f2557402a683d4c4d45206f2ab9ade7c69d3dc14e0e58"},
+ {file = "fastavro-1.8.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:401a70b1e5c7161420c6019e0c8afa88f7c8a373468591f5ec37639a903c2509"},
+ {file = "fastavro-1.8.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eef1ed3eaa4240c05698d02d8d0c010b9a03780eda37b492da6cd4c9d37e04ec"},
+ {file = "fastavro-1.8.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:543185a672ff6306beb329b57a7b8a3a2dd1eb21a5ccc530150623d58d48bb98"},
+ {file = "fastavro-1.8.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:ffbf8bae1edb50fe7beeffc3afa8e684686550c2e5d31bf01c25cfa213f581e1"},
+ {file = "fastavro-1.8.2-cp310-cp310-win_amd64.whl", hash = "sha256:bb545eb9d876bc7b785e27e98e7720ada7eee7d7a1729798d2ed51517f13500a"},
+ {file = "fastavro-1.8.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:2b837d3038c651046252bc92c1b9899bf21c7927a148a1ff89599c36c2a331ca"},
+ {file = "fastavro-1.8.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d3510e96c0a47e4e914bd1a29c954eb662bfa24849ad92e597cb97cc79f21af7"},
+ {file = "fastavro-1.8.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ccc0e74f2c2ab357f39bb73d67fcdb6dc10e23fdbbd399326139f72ec0fb99a3"},
+ {file = "fastavro-1.8.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:add51c70d0ab1175601c75cd687bbe9d16ae312cd8899b907aafe0d79ee2bc1d"},
+ {file = "fastavro-1.8.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d9e2662f57e6453e9a2c9fb4f54b2a9e62e3e46f5a412ac00558112336d23883"},
+ {file = "fastavro-1.8.2-cp311-cp311-win_amd64.whl", hash = "sha256:fea75cf53a93c56dd56e68abce8d314ef877b27451c870cd7ede7582d34c08a7"},
+ {file = "fastavro-1.8.2-cp38-cp38-macosx_11_0_x86_64.whl", hash = "sha256:f489020bb8664c2737c03457ad5dbd490579ddab6f0a7b5c17fecfe982715a89"},
+ {file = "fastavro-1.8.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a547625c138efd5e61300119241041906ee8cb426fc7aa789900f87af7ed330d"},
+ {file = "fastavro-1.8.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:53beb458f30c9ad4aa7bff4a42243ff990ffb713b6ce0cd9b360cbc3d648fe52"},
+ {file = "fastavro-1.8.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:7b1b2cbd2dd851452306beed0ab9bdaeeab1cc8ad46f84b47cd81eeaff6dd6b8"},
+ {file = "fastavro-1.8.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d29e9baee0b2f37ecd09bde3b487cf900431fd548c85be3e4fe1b9a0b2a917f1"},
+ {file = "fastavro-1.8.2-cp38-cp38-win_amd64.whl", hash = "sha256:66e132c710663230292bc63e2cb79cf95b16ccb94a5fc99bb63694b24e312fc5"},
+ {file = "fastavro-1.8.2-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:38aca63ce604039bcdf2edd14912d00287bdbf8b76f9aa42b28e6ca0bf950092"},
+ {file = "fastavro-1.8.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9787835f6449ee94713e7993a700432fce3763024791ffa8a58dc91ef9d1f950"},
+ {file = "fastavro-1.8.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:536cb448bc83811056be02749fd9df37a69621678f02597d272970a769e9b40c"},
+ {file = "fastavro-1.8.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e9d5027cf7d9968f8f819958b41bfedb933323ea6d6a0485eefacaa1afd91f54"},
+ {file = "fastavro-1.8.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:792adfc0c80c7f1109e0ab4b0decef20691fdf0a45091d397a0563872eb56d42"},
+ {file = "fastavro-1.8.2-cp39-cp39-win_amd64.whl", hash = "sha256:650b22766259f7dd7519dfa4e4658f0e233c319efa130b9cf0c36a500e09cc57"},
+ {file = "fastavro-1.8.2.tar.gz", hash = "sha256:ab9d9226d4b66b6b3d0661a57cd45259b0868fed1c0cd4fac95249b9e0973320"},
+]
+
+[package.extras]
+codecs = ["lz4", "python-snappy", "zstandard"]
+lz4 = ["lz4"]
+snappy = ["python-snappy"]
+zstandard = ["zstandard"]
+
[[package]]
name = "filelock"
version = "3.12.2"
description = "A platform independent file lock."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1423,7 +1427,6 @@ testing = ["covdefaults (>=2.3)", "coverage (>=7.2.7)", "diff-cover (>=7.5)", "p
name = "filetype"
version = "1.2.0"
description = "Infer file type and MIME type of any file/buffer. No external dependencies."
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -1435,7 +1438,6 @@ files = [
name = "flatbuffers"
version = "23.5.26"
description = "The FlatBuffers serialization format for Python"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -1447,7 +1449,6 @@ files = [
name = "frozenlist"
version = "1.4.0"
description = "A list-like structure which implements collections.abc.MutableSequence"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -1518,7 +1519,6 @@ files = [
name = "fsspec"
version = "2023.6.0"
description = "File-system specification"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -1554,7 +1554,6 @@ tqdm = ["tqdm"]
name = "gitdb"
version = "4.0.10"
description = "Git Object Database"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1569,7 +1568,6 @@ smmap = ">=3.0.1,<6"
name = "gitpython"
version = "3.1.32"
description = "GitPython is a Python library used to interact with Git repositories"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1584,7 +1582,6 @@ gitdb = ">=4.0.1,<5"
name = "google-api-core"
version = "2.11.1"
description = "Google API client core library"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1607,18 +1604,17 @@ grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0.dev0)"]
[[package]]
name = "google-api-python-client"
-version = "2.94.0"
+version = "2.96.0"
description = "Google API Client Library for Python"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "google-api-python-client-2.94.0.tar.gz", hash = "sha256:4ff598b7b83d5c0c5582927e74947286070b5b21a13e1bb64409fd92e45bfb26"},
- {file = "google_api_python_client-2.94.0-py2.py3-none-any.whl", hash = "sha256:28b2f0c2c6380a119e2efd7ecd28fa9d313becf37d71f00bfa49332428e071b4"},
+ {file = "google-api-python-client-2.96.0.tar.gz", hash = "sha256:f712373d03d338af57b9f5fe98c91f4b5baaa8765469b015bc623c4681c5bd51"},
+ {file = "google_api_python_client-2.96.0-py2.py3-none-any.whl", hash = "sha256:38c2b61b10d15bb41ec8f89303e3837ec2d2c3e4e38de5800c05ee322492f937"},
]
[package.dependencies]
-google-api-core = ">=1.31.5,<2.0.0 || >2.3.0,<3.0.0.dev0"
+google-api-core = ">=1.31.5,<2.0.dev0 || >2.3.0,<3.0.0.dev0"
google-auth = ">=1.19.0,<3.0.0.dev0"
google-auth-httplib2 = ">=0.1.0"
httplib2 = ">=0.15.0,<1.dev0"
@@ -1628,7 +1624,6 @@ uritemplate = ">=3.0.1,<5"
name = "google-auth"
version = "2.22.0"
description = "Google Authentication Library"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -1654,7 +1649,6 @@ requests = ["requests (>=2.20.0,<3.0.0.dev0)"]
name = "google-auth-httplib2"
version = "0.1.0"
description = "Google Authentication Library: httplib2 transport"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -1669,18 +1663,17 @@ six = "*"
[[package]]
name = "google-cloud-aiplatform"
-version = "1.28.1"
+version = "1.29.0"
description = "Vertex AI API client library"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "google-cloud-aiplatform-1.28.1.tar.gz", hash = "sha256:b6468db7dc50295c988edf6505f0bf4d4bb2321de28873b2a4a87fd384be6308"},
- {file = "google_cloud_aiplatform-1.28.1-py2.py3-none-any.whl", hash = "sha256:5587b8d4599047117b0c787635a00e8e3893b75944993e2faf784176442e9de4"},
+ {file = "google-cloud-aiplatform-1.29.0.tar.gz", hash = "sha256:fceabb924d2d26057e3c8c5c2e251929389aa6d553361377bc402781150c0db3"},
+ {file = "google_cloud_aiplatform-1.29.0-py2.py3-none-any.whl", hash = "sha256:cf81c1d93c61ccf3df60a65e3a5a1e465e044059d36b6fc1202b940c46c4c1e1"},
]
[package.dependencies]
-google-api-core = {version = ">=1.32.0,<2.0.0 || >=2.8.0,<3.0.0dev", extras = ["grpc"]}
+google-api-core = {version = ">=1.32.0,<2.0.dev0 || >=2.8.dev0,<3.0.0dev", extras = ["grpc"]}
google-cloud-bigquery = ">=1.15.0,<4.0.0dev"
google-cloud-resource-manager = ">=1.3.3,<3.0.0dev"
google-cloud-storage = ">=1.32.0,<3.0.0dev"
@@ -1709,7 +1702,6 @@ xai = ["tensorflow (>=2.3.0,<3.0.0dev)"]
name = "google-cloud-bigquery"
version = "3.11.4"
description = "Google BigQuery API client library"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1718,7 +1710,7 @@ files = [
]
[package.dependencies]
-google-api-core = {version = ">=1.31.5,<2.0.0 || >2.3.0,<3.0.0dev", extras = ["grpc"]}
+google-api-core = {version = ">=1.31.5,<2.0.dev0 || >2.3.0,<3.0.0dev", extras = ["grpc"]}
google-cloud-core = ">=1.6.0,<3.0.0dev"
google-resumable-media = ">=0.6.0,<3.0dev"
grpcio = ">=1.47.0,<2.0dev"
@@ -1742,7 +1734,6 @@ tqdm = ["tqdm (>=4.7.4,<5.0.0dev)"]
name = "google-cloud-core"
version = "2.3.3"
description = "Google Cloud API client core library"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1751,7 +1742,7 @@ files = [
]
[package.dependencies]
-google-api-core = ">=1.31.6,<2.0.0 || >2.3.0,<3.0.0dev"
+google-api-core = ">=1.31.6,<2.0.dev0 || >2.3.0,<3.0.0dev"
google-auth = ">=1.25.0,<3.0dev"
[package.extras]
@@ -1759,18 +1750,17 @@ grpc = ["grpcio (>=1.38.0,<2.0dev)"]
[[package]]
name = "google-cloud-resource-manager"
-version = "1.10.2"
+version = "1.10.3"
description = "Google Cloud Resource Manager API client library"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "google-cloud-resource-manager-1.10.2.tar.gz", hash = "sha256:9a7bdd0347ad553376cc66ad317c5223d1ae04bdcf74edcbfcd12605cff7b510"},
- {file = "google_cloud_resource_manager-1.10.2-py2.py3-none-any.whl", hash = "sha256:9e074c28326bd1632f1a270c20cfea1ffe98f49cf821033e65bdac55661ffbd5"},
+ {file = "google-cloud-resource-manager-1.10.3.tar.gz", hash = "sha256:f80efcea36f10c5a81889afe93910926e3978b4b1ceeb82f563a2fc863072d14"},
+ {file = "google_cloud_resource_manager-1.10.3-py2.py3-none-any.whl", hash = "sha256:1381a4b0f522248ebe0ebd1289d8822b99c54f4e1fe03924a6e723b2ed93dd7f"},
]
[package.dependencies]
-google-api-core = {version = ">=1.34.0,<2.0.0 || >=2.11.0,<3.0.0dev", extras = ["grpc"]}
+google-api-core = {version = ">=1.34.0,<2.0.dev0 || >=2.11.dev0,<3.0.0dev", extras = ["grpc"]}
grpc-google-iam-v1 = ">=0.12.4,<1.0.0dev"
proto-plus = ">=1.22.0,<2.0.0dev"
protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<5.0.0dev"
@@ -1779,7 +1769,6 @@ protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.0 || >4.21.0,<4
name = "google-cloud-storage"
version = "2.10.0"
description = "Google Cloud Storage API client library"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1788,7 +1777,7 @@ files = [
]
[package.dependencies]
-google-api-core = ">=1.31.5,<2.0.0 || >2.3.0,<3.0.0dev"
+google-api-core = ">=1.31.5,<2.0.dev0 || >2.3.0,<3.0.0dev"
google-auth = ">=1.25.0,<3.0dev"
google-cloud-core = ">=2.3.0,<3.0dev"
google-resumable-media = ">=2.3.2"
@@ -1801,7 +1790,6 @@ protobuf = ["protobuf (<5.0.0dev)"]
name = "google-crc32c"
version = "1.5.0"
description = "A python wrapper of the C library 'Google CRC32C'"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -1882,7 +1870,6 @@ testing = ["pytest"]
name = "google-resumable-media"
version = "2.5.0"
description = "Utilities for Google Media Downloads and Resumable Uploads"
-category = "main"
optional = false
python-versions = ">= 3.7"
files = [
@@ -1901,7 +1888,6 @@ requests = ["requests (>=2.18.0,<3.0.0dev)"]
name = "google-search-results"
version = "2.4.2"
description = "Scrape and search localized results from Google, Bing, Baidu, Yahoo, Yandex, Ebay, Homedepot, youtube at scale using SerpApi.com"
-category = "main"
optional = false
python-versions = ">=3.5"
files = [
@@ -1913,14 +1899,13 @@ requests = "*"
[[package]]
name = "googleapis-common-protos"
-version = "1.59.1"
+version = "1.60.0"
description = "Common protobufs used in Google APIs"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "googleapis-common-protos-1.59.1.tar.gz", hash = "sha256:b35d530fe825fb4227857bc47ad84c33c809ac96f312e13182bdeaa2abe1178a"},
- {file = "googleapis_common_protos-1.59.1-py2.py3-none-any.whl", hash = "sha256:0cbedb6fb68f1c07e18eb4c48256320777707e7d0c55063ae56c15db3224a61e"},
+ {file = "googleapis-common-protos-1.60.0.tar.gz", hash = "sha256:e73ebb404098db405ba95d1e1ae0aa91c3e15a71da031a2eeb6b2e23e7bc3708"},
+ {file = "googleapis_common_protos-1.60.0-py2.py3-none-any.whl", hash = "sha256:69f9bbcc6acde92cab2db95ce30a70bd2b81d20b12eff3f1aabaffcbe8a93918"},
]
[package.dependencies]
@@ -1934,7 +1919,6 @@ grpc = ["grpcio (>=1.44.0,<2.0.0.dev0)"]
name = "gotrue"
version = "1.0.2"
description = "Python Client Library for GoTrue"
-category = "main"
optional = false
python-versions = ">=3.8,<4.0"
files = [
@@ -1950,7 +1934,6 @@ pydantic = ">=1.10.0,<2.0.0"
name = "greenlet"
version = "2.0.2"
description = "Lightweight in-process concurrent programming"
-category = "main"
optional = false
python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*"
files = [
@@ -2024,7 +2007,6 @@ test = ["objgraph", "psutil"]
name = "grpc-google-iam-v1"
version = "0.12.6"
description = "IAM API client library"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -2041,7 +2023,6 @@ protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.1 || >4.21.1,<4
name = "grpcio"
version = "1.47.5"
description = "HTTP/2-based RPC framework"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -2103,7 +2084,6 @@ protobuf = ["grpcio-tools (>=1.47.5)"]
name = "grpcio-health-checking"
version = "1.47.5"
description = "Standard Health Checking Service for gRPC"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -2119,7 +2099,6 @@ protobuf = ">=3.12.0"
name = "grpcio-reflection"
version = "1.47.5"
description = "Standard Protobuf Reflection Service for gRPC"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -2135,7 +2114,6 @@ protobuf = ">=3.12.0"
name = "grpcio-status"
version = "1.47.5"
description = "Status proto mapping for gRPC"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -2152,7 +2130,6 @@ protobuf = ">=3.12.0"
name = "grpcio-tools"
version = "1.47.5"
description = "Protobuf code generator for gRPC"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -2213,7 +2190,6 @@ setuptools = "*"
name = "gunicorn"
version = "21.2.0"
description = "WSGI HTTP Server for UNIX"
-category = "main"
optional = false
python-versions = ">=3.5"
files = [
@@ -2234,7 +2210,6 @@ tornado = ["tornado (>=0.2)"]
name = "h11"
version = "0.14.0"
description = "A pure-Python, bring-your-own-I/O implementation of HTTP/1.1"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -2246,7 +2221,6 @@ files = [
name = "h2"
version = "4.1.0"
description = "HTTP/2 State-Machine based protocol implementation"
-category = "main"
optional = false
python-versions = ">=3.6.1"
files = [
@@ -2262,7 +2236,6 @@ hyperframe = ">=6.0,<7"
name = "hnswlib"
version = "0.7.0"
description = "hnswlib"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -2276,7 +2249,6 @@ numpy = "*"
name = "hpack"
version = "4.0.0"
description = "Pure-Python HPACK header compression"
-category = "main"
optional = false
python-versions = ">=3.6.1"
files = [
@@ -2288,7 +2260,6 @@ files = [
name = "httpcore"
version = "0.16.3"
description = "A minimal low-level HTTP client."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -2300,17 +2271,16 @@ files = [
anyio = ">=3.0,<5.0"
certifi = "*"
h11 = ">=0.13,<0.15"
-sniffio = ">=1.0.0,<2.0.0"
+sniffio = "==1.*"
[package.extras]
http2 = ["h2 (>=3,<5)"]
-socks = ["socksio (>=1.0.0,<2.0.0)"]
+socks = ["socksio (==1.*)"]
[[package]]
name = "httplib2"
version = "0.22.0"
description = "A comprehensive HTTP client library."
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
@@ -2325,7 +2295,6 @@ pyparsing = {version = ">=2.4.2,<3.0.0 || >3.0.0,<3.0.1 || >3.0.1,<3.0.2 || >3.0
name = "httptools"
version = "0.6.0"
description = "A collection of framework independent HTTP protocol utils."
-category = "main"
optional = false
python-versions = ">=3.5.0"
files = [
@@ -2373,7 +2342,6 @@ test = ["Cython (>=0.29.24,<0.30.0)"]
name = "httpx"
version = "0.23.3"
description = "The next generation HTTP client."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -2390,47 +2358,48 @@ sniffio = "*"
[package.extras]
brotli = ["brotli", "brotlicffi"]
-cli = ["click (>=8.0.0,<9.0.0)", "pygments (>=2.0.0,<3.0.0)", "rich (>=10,<13)"]
+cli = ["click (==8.*)", "pygments (==2.*)", "rich (>=10,<13)"]
http2 = ["h2 (>=3,<5)"]
-socks = ["socksio (>=1.0.0,<2.0.0)"]
+socks = ["socksio (==1.*)"]
[[package]]
name = "huggingface-hub"
-version = "0.15.1"
+version = "0.16.4"
description = "Client library to download and publish models, datasets and other repos on the huggingface.co hub"
-category = "main"
optional = false
python-versions = ">=3.7.0"
files = [
- {file = "huggingface_hub-0.15.1-py3-none-any.whl", hash = "sha256:05b0fb0abbf1f625dfee864648ac3049fe225ac4371c7bafaca0c2d3a2f83445"},
- {file = "huggingface_hub-0.15.1.tar.gz", hash = "sha256:a61b7d1a7769fe10119e730277c72ab99d95c48d86a3d6da3e9f3d0f632a4081"},
+ {file = "huggingface_hub-0.16.4-py3-none-any.whl", hash = "sha256:0d3df29932f334fead024afc7cb4cc5149d955238b8b5e42dcf9740d6995a349"},
+ {file = "huggingface_hub-0.16.4.tar.gz", hash = "sha256:608c7d4f3d368b326d1747f91523dbd1f692871e8e2e7a4750314a2dd8b63e14"},
]
[package.dependencies]
+aiohttp = {version = "*", optional = true, markers = "extra == \"inference\""}
filelock = "*"
fsspec = "*"
packaging = ">=20.9"
+pydantic = {version = "*", optional = true, markers = "extra == \"inference\""}
pyyaml = ">=5.1"
requests = "*"
tqdm = ">=4.42.1"
typing-extensions = ">=3.7.4.3"
[package.extras]
-all = ["InquirerPy (==0.3.4)", "Jinja2", "Pillow", "black (>=23.1,<24.0)", "gradio", "jedi", "mypy (==0.982)", "numpy", "pytest", "pytest-cov", "pytest-env", "pytest-vcr", "pytest-xdist", "ruff (>=0.0.241)", "soundfile", "types-PyYAML", "types-requests", "types-simplejson", "types-toml", "types-tqdm", "types-urllib3", "urllib3 (<2.0)"]
+all = ["InquirerPy (==0.3.4)", "Jinja2", "Pillow", "aiohttp", "black (>=23.1,<24.0)", "gradio", "jedi", "mypy (==0.982)", "numpy", "pydantic", "pytest", "pytest-asyncio", "pytest-cov", "pytest-env", "pytest-vcr", "pytest-xdist", "ruff (>=0.0.241)", "soundfile", "types-PyYAML", "types-requests", "types-simplejson", "types-toml", "types-tqdm", "types-urllib3", "urllib3 (<2.0)"]
cli = ["InquirerPy (==0.3.4)"]
-dev = ["InquirerPy (==0.3.4)", "Jinja2", "Pillow", "black (>=23.1,<24.0)", "gradio", "jedi", "mypy (==0.982)", "numpy", "pytest", "pytest-cov", "pytest-env", "pytest-vcr", "pytest-xdist", "ruff (>=0.0.241)", "soundfile", "types-PyYAML", "types-requests", "types-simplejson", "types-toml", "types-tqdm", "types-urllib3", "urllib3 (<2.0)"]
+dev = ["InquirerPy (==0.3.4)", "Jinja2", "Pillow", "aiohttp", "black (>=23.1,<24.0)", "gradio", "jedi", "mypy (==0.982)", "numpy", "pydantic", "pytest", "pytest-asyncio", "pytest-cov", "pytest-env", "pytest-vcr", "pytest-xdist", "ruff (>=0.0.241)", "soundfile", "types-PyYAML", "types-requests", "types-simplejson", "types-toml", "types-tqdm", "types-urllib3", "urllib3 (<2.0)"]
fastai = ["fastai (>=2.4)", "fastcore (>=1.3.27)", "toml"]
+inference = ["aiohttp", "pydantic"]
quality = ["black (>=23.1,<24.0)", "mypy (==0.982)", "ruff (>=0.0.241)"]
tensorflow = ["graphviz", "pydot", "tensorflow"]
-testing = ["InquirerPy (==0.3.4)", "Jinja2", "Pillow", "gradio", "jedi", "numpy", "pytest", "pytest-cov", "pytest-env", "pytest-vcr", "pytest-xdist", "soundfile", "urllib3 (<2.0)"]
+testing = ["InquirerPy (==0.3.4)", "Jinja2", "Pillow", "aiohttp", "gradio", "jedi", "numpy", "pydantic", "pytest", "pytest-asyncio", "pytest-cov", "pytest-env", "pytest-vcr", "pytest-xdist", "soundfile", "urllib3 (<2.0)"]
torch = ["torch"]
-typing = ["types-PyYAML", "types-requests", "types-simplejson", "types-toml", "types-tqdm", "types-urllib3"]
+typing = ["pydantic", "types-PyYAML", "types-requests", "types-simplejson", "types-toml", "types-tqdm", "types-urllib3"]
[[package]]
name = "humanfriendly"
version = "10.0"
description = "Human friendly output for text interfaces using Python"
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
files = [
@@ -2445,7 +2414,6 @@ pyreadline3 = {version = "*", markers = "sys_platform == \"win32\" and python_ve
name = "hyperframe"
version = "6.0.1"
description = "HTTP/2 framing layer for Python"
-category = "main"
optional = false
python-versions = ">=3.6.1"
files = [
@@ -2457,7 +2425,6 @@ files = [
name = "idna"
version = "3.4"
description = "Internationalized Domain Names in Applications (IDNA)"
-category = "main"
optional = false
python-versions = ">=3.5"
files = [
@@ -2469,7 +2436,6 @@ files = [
name = "importlib-metadata"
version = "6.8.0"
description = "Read metadata from Python packages"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -2487,14 +2453,13 @@ testing = ["flufl.flake8", "importlib-resources (>=1.3)", "packaging", "pyfakefs
[[package]]
name = "importlib-resources"
-version = "6.0.0"
+version = "6.0.1"
description = "Read resources from Python packages"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
- {file = "importlib_resources-6.0.0-py3-none-any.whl", hash = "sha256:d952faee11004c045f785bb5636e8f885bed30dc3c940d5d42798a2a4541c185"},
- {file = "importlib_resources-6.0.0.tar.gz", hash = "sha256:4cf94875a8368bd89531a756df9a9ebe1f150e0f885030b461237bc7f2d905f2"},
+ {file = "importlib_resources-6.0.1-py3-none-any.whl", hash = "sha256:134832a506243891221b88b4ae1213327eea96ceb4e407a00d790bb0626f45cf"},
+ {file = "importlib_resources-6.0.1.tar.gz", hash = "sha256:4359457e42708462b9626a04657c6208ad799ceb41e5c58c57ffa0e6a098a5d4"},
]
[package.dependencies]
@@ -2508,7 +2473,6 @@ testing = ["pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)",
name = "iniconfig"
version = "2.0.0"
description = "brain-dead simple config-ini parsing"
-category = "dev"
optional = false
python-versions = ">=3.7"
files = [
@@ -2520,7 +2484,6 @@ files = [
name = "invoke"
version = "1.7.3"
description = "Pythonic task execution"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -2530,14 +2493,13 @@ files = [
[[package]]
name = "ipykernel"
-version = "6.24.0"
+version = "6.25.1"
description = "IPython Kernel for Jupyter"
-category = "dev"
optional = false
python-versions = ">=3.8"
files = [
- {file = "ipykernel-6.24.0-py3-none-any.whl", hash = "sha256:2f5fffc7ad8f1fd5aadb4e171ba9129d9668dbafa374732cf9511ada52d6547f"},
- {file = "ipykernel-6.24.0.tar.gz", hash = "sha256:29cea0a716b1176d002a61d0b0c851f34536495bc4ef7dd0222c88b41b816123"},
+ {file = "ipykernel-6.25.1-py3-none-any.whl", hash = "sha256:c8a2430b357073b37c76c21c52184db42f6b4b0e438e1eb7df3c4440d120497c"},
+ {file = "ipykernel-6.25.1.tar.gz", hash = "sha256:050391364c0977e768e354bdb60cbbfbee7cbb943b1af1618382021136ffd42f"},
]
[package.dependencies]
@@ -2546,7 +2508,7 @@ comm = ">=0.1.1"
debugpy = ">=1.6.5"
ipython = ">=7.23.1"
jupyter-client = ">=6.1.12"
-jupyter-core = ">=4.12,<5.0.0 || >=5.1.0"
+jupyter-core = ">=4.12,<5.0.dev0 || >=5.1.dev0"
matplotlib-inline = ">=0.1"
nest-asyncio = "*"
packaging = "*"
@@ -2566,7 +2528,6 @@ test = ["flaky", "ipyparallel", "pre-commit", "pytest (>=7.0)", "pytest-asyncio"
name = "ipython"
version = "8.14.0"
description = "IPython: Productive Interactive Computing"
-category = "dev"
optional = false
python-versions = ">=3.9"
files = [
@@ -2606,7 +2567,6 @@ test-extra = ["curio", "matplotlib (!=3.2.0)", "nbformat", "numpy (>=1.21)", "pa
name = "jaraco-classes"
version = "3.3.0"
description = "Utility functions for Python class constructs"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -2623,13 +2583,12 @@ testing = ["pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)",
[[package]]
name = "jcloud"
-version = "0.2.12"
+version = "0.2.16"
description = "Simplify deploying and managing Jina projects on Jina Cloud"
-category = "main"
optional = false
python-versions = "*"
files = [
- {file = "jcloud-0.2.12.tar.gz", hash = "sha256:b7bae0909e4f09267aaa681c59e86ae27d3812fb7b63ad018bfbbcfdf8c0ad1e"},
+ {file = "jcloud-0.2.16.tar.gz", hash = "sha256:abf39a70fc5852574a05e03dac5e5cc364df87ae1f8476cbd441178f14adb578"},
]
[package.dependencies]
@@ -2646,29 +2605,27 @@ test = ["black (==22.3.0)", "jina (>=3.7.0)", "mock", "pytest", "pytest-asyncio"
[[package]]
name = "jedi"
-version = "0.18.2"
+version = "0.19.0"
description = "An autocompletion tool for Python that can be used for text editors."
-category = "dev"
optional = false
python-versions = ">=3.6"
files = [
- {file = "jedi-0.18.2-py2.py3-none-any.whl", hash = "sha256:203c1fd9d969ab8f2119ec0a3342e0b49910045abe6af0a3ae83a5764d54639e"},
- {file = "jedi-0.18.2.tar.gz", hash = "sha256:bae794c30d07f6d910d32a7048af09b5a39ed740918da923c6b780790ebac612"},
+ {file = "jedi-0.19.0-py2.py3-none-any.whl", hash = "sha256:cb8ce23fbccff0025e9386b5cf85e892f94c9b822378f8da49970471335ac64e"},
+ {file = "jedi-0.19.0.tar.gz", hash = "sha256:bcf9894f1753969cbac8022a8c2eaee06bfa3724e4192470aaffe7eb6272b0c4"},
]
[package.dependencies]
-parso = ">=0.8.0,<0.9.0"
+parso = ">=0.8.3,<0.9.0"
[package.extras]
docs = ["Jinja2 (==2.11.3)", "MarkupSafe (==1.1.1)", "Pygments (==2.8.1)", "alabaster (==0.7.12)", "babel (==2.9.1)", "chardet (==4.0.0)", "commonmark (==0.8.1)", "docutils (==0.17.1)", "future (==0.18.2)", "idna (==2.10)", "imagesize (==1.2.0)", "mock (==1.0.1)", "packaging (==20.9)", "pyparsing (==2.4.7)", "pytz (==2021.1)", "readthedocs-sphinx-ext (==2.1.4)", "recommonmark (==0.5.0)", "requests (==2.25.1)", "six (==1.15.0)", "snowballstemmer (==2.1.0)", "sphinx (==1.8.5)", "sphinx-rtd-theme (==0.4.3)", "sphinxcontrib-serializinghtml (==1.1.4)", "sphinxcontrib-websupport (==1.2.4)", "urllib3 (==1.26.4)"]
-qa = ["flake8 (==3.8.3)", "mypy (==0.782)"]
+qa = ["flake8 (==5.0.4)", "mypy (==0.971)", "types-setuptools (==67.2.0.1)"]
testing = ["Django (<3.1)", "attrs", "colorama", "docopt", "pytest (<7.0.0)"]
[[package]]
name = "jeepney"
version = "0.8.0"
description = "Low-level, pure Python DBus protocol wrapper."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -2684,7 +2641,6 @@ trio = ["async_generator", "trio"]
name = "jina"
version = "3.15.2"
description = "Build multimodal AI services via cloud native technologies ยท Neural Search ยท Generative AI ยท MLOps"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -2802,7 +2758,6 @@ websockets = ["websockets"]
name = "jina-hubble-sdk"
version = "0.39.0"
description = "SDK for Hubble API at Jina AI."
-category = "main"
optional = false
python-versions = ">=3.7.0"
files = [
@@ -2828,8 +2783,7 @@ full = ["aiohttp", "black (==22.3.0)", "docker", "filelock", "flake8 (==4.0.1)",
name = "jinja2"
version = "3.1.2"
description = "A very fast and expressive template engine."
-category = "main"
-optional = false
+optional = true
python-versions = ">=3.7"
files = [
{file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
@@ -2844,21 +2798,19 @@ i18n = ["Babel (>=2.7)"]
[[package]]
name = "joblib"
-version = "1.3.1"
+version = "1.3.2"
description = "Lightweight pipelining with Python functions"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "joblib-1.3.1-py3-none-any.whl", hash = "sha256:89cf0529520e01b3de7ac7b74a8102c90d16d54c64b5dd98cafcd14307fdf915"},
- {file = "joblib-1.3.1.tar.gz", hash = "sha256:1f937906df65329ba98013dc9692fe22a4c5e4a648112de500508b18a21b41e3"},
+ {file = "joblib-1.3.2-py3-none-any.whl", hash = "sha256:ef4331c65f239985f3f2220ecc87db222f08fd22097a3dd5698f693875f8cbb9"},
+ {file = "joblib-1.3.2.tar.gz", hash = "sha256:92f865e621e17784e7955080b6d042489e3b8e294949cc44c6eac304f59772b1"},
]
[[package]]
name = "jupyter-client"
version = "8.3.0"
description = "Jupyter protocol implementation and client libraries"
-category = "dev"
optional = false
python-versions = ">=3.8"
files = [
@@ -2868,7 +2820,7 @@ files = [
[package.dependencies]
importlib-metadata = {version = ">=4.8.3", markers = "python_version < \"3.10\""}
-jupyter-core = ">=4.12,<5.0.0 || >=5.1.0"
+jupyter-core = ">=4.12,<5.0.dev0 || >=5.1.dev0"
python-dateutil = ">=2.8.2"
pyzmq = ">=23.0"
tornado = ">=6.2"
@@ -2882,7 +2834,6 @@ test = ["coverage", "ipykernel (>=6.14)", "mypy", "paramiko", "pre-commit", "pyt
name = "jupyter-core"
version = "5.3.1"
description = "Jupyter core package. A base package on which Jupyter projects rely."
-category = "dev"
optional = false
python-versions = ">=3.8"
files = [
@@ -2903,7 +2854,6 @@ test = ["ipykernel", "pre-commit", "pytest", "pytest-cov", "pytest-timeout"]
name = "keyring"
version = "24.2.0"
description = "Store and access your passwords safely."
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -2925,14 +2875,13 @@ testing = ["pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)",
[[package]]
name = "langchain"
-version = "0.0.240"
+version = "0.0.256"
description = "Building applications with LLMs through composability"
-category = "main"
optional = false
python-versions = ">=3.8.1,<4.0"
files = [
- {file = "langchain-0.0.240-py3-none-any.whl", hash = "sha256:110d68116b9bf4eff3aa342a3d6e521f841f9af22fcc02ed52699ada41a46b90"},
- {file = "langchain-0.0.240.tar.gz", hash = "sha256:698669880d94498ce90f33b28222d46be6297c7b280a399612af7e7a5af39dd6"},
+ {file = "langchain-0.0.256-py3-none-any.whl", hash = "sha256:3389fcb85d8d4fb16bae5ca9995d3ce634a3330f8ac1f458afc6171e4ca52de5"},
+ {file = "langchain-0.0.256.tar.gz", hash = "sha256:b80115e19f86199c49bca8ef18c09d2d87548332a0144a1c5ce6a2f82e4f5f9c"},
]
[package.dependencies]
@@ -2944,39 +2893,52 @@ numexpr = ">=2.8.4,<3.0.0"
numpy = ">=1,<2"
openapi-schema-pydantic = ">=1.2,<2.0"
pydantic = ">=1,<2"
-PyYAML = ">=5.4.1"
+PyYAML = ">=5.3"
requests = ">=2,<3"
SQLAlchemy = ">=1.4,<3"
tenacity = ">=8.1.0,<9.0.0"
[package.extras]
-all = ["O365 (>=2.0.26,<3.0.0)", "aleph-alpha-client (>=2.15.0,<3.0.0)", "amadeus (>=8.1.0)", "anthropic (>=0.3,<0.4)", "arxiv (>=1.4,<2.0)", "atlassian-python-api (>=3.36.0,<4.0.0)", "awadb (>=0.3.3,<0.4.0)", "azure-ai-formrecognizer (>=3.2.1,<4.0.0)", "azure-ai-vision (>=0.11.1b1,<0.12.0)", "azure-cognitiveservices-speech (>=1.28.0,<2.0.0)", "azure-cosmos (>=4.4.0b1,<5.0.0)", "azure-identity (>=1.12.0,<2.0.0)", "beautifulsoup4 (>=4,<5)", "clarifai (>=9.1.0)", "clickhouse-connect (>=0.5.14,<0.6.0)", "cohere (>=3,<4)", "deeplake (>=3.6.8,<4.0.0)", "docarray[hnswlib] (>=0.32.0,<0.33.0)", "duckduckgo-search (>=3.8.3,<4.0.0)", "elasticsearch (>=8,<9)", "esprima (>=4.0.1,<5.0.0)", "faiss-cpu (>=1,<2)", "google-api-python-client (==2.70.0)", "google-auth (>=2.18.1,<3.0.0)", "google-search-results (>=2,<3)", "gptcache (>=0.1.7)", "html2text (>=2020.1.16,<2021.0.0)", "huggingface_hub (>=0,<1)", "jina (>=3.14,<4.0)", "jinja2 (>=3,<4)", "jq (>=1.4.1,<2.0.0)", "lancedb (>=0.1,<0.2)", "langkit (>=0.0.6,<0.1.0)", "lark (>=1.1.5,<2.0.0)", "libdeeplake (>=0.0.60,<0.0.61)", "lxml (>=4.9.2,<5.0.0)", "manifest-ml (>=0.0.1,<0.0.2)", "marqo (>=0.11.0,<0.12.0)", "momento (>=1.5.0,<2.0.0)", "nebula3-python (>=3.4.0,<4.0.0)", "neo4j (>=5.8.1,<6.0.0)", "networkx (>=2.6.3,<3.0.0)", "nlpcloud (>=1,<2)", "nltk (>=3,<4)", "nomic (>=1.0.43,<2.0.0)", "octoai-sdk (>=0.1.1,<0.2.0)", "openai (>=0,<1)", "openlm (>=0.0.5,<0.0.6)", "opensearch-py (>=2.0.0,<3.0.0)", "pdfminer-six (>=20221105,<20221106)", "pexpect (>=4.8.0,<5.0.0)", "pgvector (>=0.1.6,<0.2.0)", "pinecone-client (>=2,<3)", "pinecone-text (>=0.4.2,<0.5.0)", "psycopg2-binary (>=2.9.5,<3.0.0)", "pymongo (>=4.3.3,<5.0.0)", "pyowm (>=3.3.0,<4.0.0)", "pypdf (>=3.4.0,<4.0.0)", "pytesseract (>=0.3.10,<0.4.0)", "pyvespa (>=0.33.0,<0.34.0)", "qdrant-client (>=1.3.1,<2.0.0)", "rdflib (>=6.3.2,<7.0.0)", "redis (>=4,<5)", "requests-toolbelt (>=1.0.0,<2.0.0)", "sentence-transformers (>=2,<3)", "singlestoredb (>=0.7.1,<0.8.0)", "spacy (>=3,<4)", "steamship (>=2.16.9,<3.0.0)", "tensorflow-text (>=2.11.0,<3.0.0)", "tigrisdb (>=1.0.0b6,<2.0.0)", "tiktoken (>=0.3.2,<0.4.0)", "torch (>=1,<3)", "transformers (>=4,<5)", "weaviate-client (>=3,<4)", "wikipedia (>=1,<2)", "wolframalpha (==5.0.0)"]
-azure = ["azure-ai-formrecognizer (>=3.2.1,<4.0.0)", "azure-ai-vision (>=0.11.1b1,<0.12.0)", "azure-cognitiveservices-speech (>=1.28.0,<2.0.0)", "azure-core (>=1.26.4,<2.0.0)", "azure-cosmos (>=4.4.0b1,<5.0.0)", "azure-identity (>=1.12.0,<2.0.0)", "azure-search-documents (==11.4.0a20230509004)", "openai (>=0,<1)"]
+all = ["O365 (>=2.0.26,<3.0.0)", "aleph-alpha-client (>=2.15.0,<3.0.0)", "amadeus (>=8.1.0)", "anthropic (>=0.3,<0.4)", "arxiv (>=1.4,<2.0)", "atlassian-python-api (>=3.36.0,<4.0.0)", "awadb (>=0.3.9,<0.4.0)", "azure-ai-formrecognizer (>=3.2.1,<4.0.0)", "azure-ai-vision (>=0.11.1b1,<0.12.0)", "azure-cognitiveservices-speech (>=1.28.0,<2.0.0)", "azure-cosmos (>=4.4.0b1,<5.0.0)", "azure-identity (>=1.12.0,<2.0.0)", "beautifulsoup4 (>=4,<5)", "clarifai (>=9.1.0)", "clickhouse-connect (>=0.5.14,<0.6.0)", "cohere (>=4,<5)", "deeplake (>=3.6.8,<4.0.0)", "docarray[hnswlib] (>=0.32.0,<0.33.0)", "duckduckgo-search (>=3.8.3,<4.0.0)", "elasticsearch (>=8,<9)", "esprima (>=4.0.1,<5.0.0)", "faiss-cpu (>=1,<2)", "google-api-python-client (==2.70.0)", "google-auth (>=2.18.1,<3.0.0)", "google-search-results (>=2,<3)", "gptcache (>=0.1.7)", "html2text (>=2020.1.16,<2021.0.0)", "huggingface_hub (>=0,<1)", "jina (>=3.14,<4.0)", "jinja2 (>=3,<4)", "jq (>=1.4.1,<2.0.0)", "lancedb (>=0.1,<0.2)", "langkit (>=0.0.6,<0.1.0)", "lark (>=1.1.5,<2.0.0)", "libdeeplake (>=0.0.60,<0.0.61)", "librosa (>=0.10.0.post2,<0.11.0)", "lxml (>=4.9.2,<5.0.0)", "manifest-ml (>=0.0.1,<0.0.2)", "marqo (>=0.11.0,<0.12.0)", "momento (>=1.5.0,<2.0.0)", "nebula3-python (>=3.4.0,<4.0.0)", "neo4j (>=5.8.1,<6.0.0)", "networkx (>=2.6.3,<3.0.0)", "nlpcloud (>=1,<2)", "nltk (>=3,<4)", "nomic (>=1.0.43,<2.0.0)", "octoai-sdk (>=0.1.1,<0.2.0)", "openai (>=0,<1)", "openlm (>=0.0.5,<0.0.6)", "opensearch-py (>=2.0.0,<3.0.0)", "pdfminer-six (>=20221105,<20221106)", "pexpect (>=4.8.0,<5.0.0)", "pgvector (>=0.1.6,<0.2.0)", "pinecone-client (>=2,<3)", "pinecone-text (>=0.4.2,<0.5.0)", "psycopg2-binary (>=2.9.5,<3.0.0)", "pymongo (>=4.3.3,<5.0.0)", "pyowm (>=3.3.0,<4.0.0)", "pypdf (>=3.4.0,<4.0.0)", "pytesseract (>=0.3.10,<0.4.0)", "python-arango (>=7.5.9,<8.0.0)", "pyvespa (>=0.33.0,<0.34.0)", "qdrant-client (>=1.3.1,<2.0.0)", "rdflib (>=6.3.2,<7.0.0)", "redis (>=4,<5)", "requests-toolbelt (>=1.0.0,<2.0.0)", "sentence-transformers (>=2,<3)", "singlestoredb (>=0.7.1,<0.8.0)", "spacy (>=3,<4)", "steamship (>=2.16.9,<3.0.0)", "tensorflow-text (>=2.11.0,<3.0.0)", "tigrisdb (>=1.0.0b6,<2.0.0)", "tiktoken (>=0.3.2,<0.4.0)", "torch (>=1,<3)", "transformers (>=4,<5)", "weaviate-client (>=3,<4)", "wikipedia (>=1,<2)", "wolframalpha (==5.0.0)", "xinference (>=0.0.6,<0.0.7)"]
+azure = ["azure-ai-formrecognizer (>=3.2.1,<4.0.0)", "azure-ai-vision (>=0.11.1b1,<0.12.0)", "azure-cognitiveservices-speech (>=1.28.0,<2.0.0)", "azure-core (>=1.26.4,<2.0.0)", "azure-cosmos (>=4.4.0b1,<5.0.0)", "azure-identity (>=1.12.0,<2.0.0)", "azure-search-documents (==11.4.0b6)", "openai (>=0,<1)"]
clarifai = ["clarifai (>=9.1.0)"]
-cohere = ["cohere (>=3,<4)"]
+cohere = ["cohere (>=4,<5)"]
docarray = ["docarray[hnswlib] (>=0.32.0,<0.33.0)"]
embeddings = ["sentence-transformers (>=2,<3)"]
-extended-testing = ["atlassian-python-api (>=3.36.0,<4.0.0)", "beautifulsoup4 (>=4,<5)", "bibtexparser (>=1.4.0,<2.0.0)", "cassio (>=0.0.7,<0.0.8)", "chardet (>=5.1.0,<6.0.0)", "esprima (>=4.0.1,<5.0.0)", "geopandas (>=0.13.1,<0.14.0)", "gql (>=3.4.1,<4.0.0)", "html2text (>=2020.1.16,<2021.0.0)", "jinja2 (>=3,<4)", "jq (>=1.4.1,<2.0.0)", "lxml (>=4.9.2,<5.0.0)", "mwparserfromhell (>=0.6.4,<0.7.0)", "mwxml (>=0.3.3,<0.4.0)", "openai (>=0,<1)", "pandas (>=2.0.1,<3.0.0)", "pdfminer-six (>=20221105,<20221106)", "pgvector (>=0.1.6,<0.2.0)", "psychicapi (>=0.8.0,<0.9.0)", "py-trello (>=0.19.0,<0.20.0)", "pymupdf (>=1.22.3,<2.0.0)", "pypdf (>=3.4.0,<4.0.0)", "pypdfium2 (>=4.10.0,<5.0.0)", "pyspark (>=3.4.0,<4.0.0)", "rank-bm25 (>=0.2.2,<0.3.0)", "rapidfuzz (>=3.1.1,<4.0.0)", "requests-toolbelt (>=1.0.0,<2.0.0)", "scikit-learn (>=1.2.2,<2.0.0)", "streamlit (>=1.18.0,<2.0.0)", "sympy (>=1.12,<2.0)", "telethon (>=1.28.5,<2.0.0)", "tqdm (>=4.48.0)", "zep-python (>=0.32)"]
+extended-testing = ["amazon-textract-caller (<2)", "atlassian-python-api (>=3.36.0,<4.0.0)", "beautifulsoup4 (>=4,<5)", "bibtexparser (>=1.4.0,<2.0.0)", "cassio (>=0.0.7,<0.0.8)", "chardet (>=5.1.0,<6.0.0)", "esprima (>=4.0.1,<5.0.0)", "feedparser (>=6.0.10,<7.0.0)", "geopandas (>=0.13.1,<0.14.0)", "gitpython (>=3.1.32,<4.0.0)", "gql (>=3.4.1,<4.0.0)", "html2text (>=2020.1.16,<2021.0.0)", "jinja2 (>=3,<4)", "jq (>=1.4.1,<2.0.0)", "lxml (>=4.9.2,<5.0.0)", "mwparserfromhell (>=0.6.4,<0.7.0)", "mwxml (>=0.3.3,<0.4.0)", "newspaper3k (>=0.2.8,<0.3.0)", "openai (>=0,<1)", "pandas (>=2.0.1,<3.0.0)", "pdfminer-six (>=20221105,<20221106)", "pgvector (>=0.1.6,<0.2.0)", "psychicapi (>=0.8.0,<0.9.0)", "py-trello (>=0.19.0,<0.20.0)", "pymupdf (>=1.22.3,<2.0.0)", "pypdf (>=3.4.0,<4.0.0)", "pypdfium2 (>=4.10.0,<5.0.0)", "pyspark (>=3.4.0,<4.0.0)", "rank-bm25 (>=0.2.2,<0.3.0)", "rapidfuzz (>=3.1.1,<4.0.0)", "requests-toolbelt (>=1.0.0,<2.0.0)", "scikit-learn (>=1.2.2,<2.0.0)", "streamlit (>=1.18.0,<2.0.0)", "sympy (>=1.12,<2.0)", "telethon (>=1.28.5,<2.0.0)", "tqdm (>=4.48.0)", "xata (>=1.0.0a7,<2.0.0)", "xinference (>=0.0.6,<0.0.7)", "zep-python (>=0.32)"]
javascript = ["esprima (>=4.0.1,<5.0.0)"]
-llms = ["anthropic (>=0.3,<0.4)", "clarifai (>=9.1.0)", "cohere (>=3,<4)", "huggingface_hub (>=0,<1)", "manifest-ml (>=0.0.1,<0.0.2)", "nlpcloud (>=1,<2)", "openai (>=0,<1)", "openllm (>=0.1.19)", "openlm (>=0.0.5,<0.0.6)", "torch (>=1,<3)", "transformers (>=4,<5)"]
+llms = ["anthropic (>=0.3,<0.4)", "clarifai (>=9.1.0)", "cohere (>=4,<5)", "huggingface_hub (>=0,<1)", "manifest-ml (>=0.0.1,<0.0.2)", "nlpcloud (>=1,<2)", "openai (>=0,<1)", "openllm (>=0.1.19)", "openlm (>=0.0.5,<0.0.6)", "torch (>=1,<3)", "transformers (>=4,<5)", "xinference (>=0.0.6,<0.0.7)"]
openai = ["openai (>=0,<1)", "tiktoken (>=0.3.2,<0.4.0)"]
qdrant = ["qdrant-client (>=1.3.1,<2.0.0)"]
text-helpers = ["chardet (>=5.1.0,<6.0.0)"]
+[[package]]
+name = "langchain-experimental"
+version = "0.0.8"
+description = "Building applications with LLMs through composability"
+optional = false
+python-versions = ">=3.8.1,<4.0"
+files = [
+ {file = "langchain_experimental-0.0.8-py3-none-any.whl", hash = "sha256:34cf202ba29fdef178c5d68772cf2fa08dd2a0fad588ca3ef324ae71e596cc21"},
+ {file = "langchain_experimental-0.0.8.tar.gz", hash = "sha256:35d198f8e70a053ccd84273198ff08b4b700ac03ec5f43ba9b7ef797dd42ad14"},
+]
+
+[package.dependencies]
+langchain = ">=0.0.239"
+
[[package]]
name = "langchain-serve"
-version = "0.0.57"
+version = "0.0.60"
description = "Langchain Serve - serve your langchain apps on Jina AI Cloud."
-category = "main"
optional = true
python-versions = "*"
files = [
- {file = "langchain-serve-0.0.57.tar.gz", hash = "sha256:acd207c7c384232ae3d197b0ef0877d1afeb1fea89401722ab33fc6fb85ed2f5"},
+ {file = "langchain-serve-0.0.60.tar.gz", hash = "sha256:bfcc2e7c2a3cd3b4cde5ff45043cc9c8d437704941b02d166185d8334a120561"},
]
[package.dependencies]
click = "*"
-jcloud = ">=0.2.8,<=0.2.12"
+jcloud = ">=0.2.16"
jina = "3.15.2"
jina-hubble-sdk = "*"
langchain = "*"
@@ -2991,14 +2953,13 @@ test = ["psutil", "pytest", "pytest-asyncio"]
[[package]]
name = "langsmith"
-version = "0.0.14"
+version = "0.0.20"
description = "Client library to connect to the LangSmith LLM Tracing and Evaluation Platform."
-category = "main"
optional = false
python-versions = ">=3.8.1,<4.0"
files = [
- {file = "langsmith-0.0.14-py3-none-any.whl", hash = "sha256:d3c367fa72b88a226919e6c902e34d83791efd1250f284e84cf17eacc37558b9"},
- {file = "langsmith-0.0.14.tar.gz", hash = "sha256:146379f4ed8a7a28794c52b74009d40875371080a16a87bba95c410160b00b92"},
+ {file = "langsmith-0.0.20-py3-none-any.whl", hash = "sha256:c393a45f6da79c7dfbeb778eebce660020d1c26ec9579fe997c8455a6765e4b4"},
+ {file = "langsmith-0.0.20.tar.gz", hash = "sha256:825056a6ee4583e3dd473e7f47ef45e85841517d84162fbb164771aae4aa391d"},
]
[package.dependencies]
@@ -3009,7 +2970,6 @@ requests = ">=2,<3"
name = "linkify-it-py"
version = "2.0.2"
description = "Links recognition library with FULL unicode support."
-category = "main"
optional = true
python-versions = ">=3.7"
files = [
@@ -3028,13 +2988,12 @@ test = ["coverage", "pytest", "pytest-cov"]
[[package]]
name = "llama-cpp-python"
-version = "0.1.74"
+version = "0.1.77"
description = "A Python wrapper for llama.cpp"
-category = "main"
-optional = false
+optional = true
python-versions = ">=3.7"
files = [
- {file = "llama_cpp_python-0.1.74.tar.gz", hash = "sha256:406db14d9e1b32fccf0505c2aad74f349232aa860995663cfb2d3b52143c4376"},
+ {file = "llama_cpp_python-0.1.77.tar.gz", hash = "sha256:76c7fae8f5386edecf38cb149bf119127e1208883f0456c6998465648d6c242e"},
]
[package.dependencies]
@@ -3049,7 +3008,6 @@ server = ["fastapi (>=0.100.0)", "pydantic-settings (>=2.0.1)", "sse-starlette (
name = "loguru"
version = "0.7.0"
description = "Python logging made (stupidly) simple"
-category = "main"
optional = false
python-versions = ">=3.5"
files = [
@@ -3068,7 +3026,6 @@ dev = ["Sphinx (==5.3.0)", "colorama (==0.4.5)", "colorama (==0.4.6)", "freezegu
name = "lxml"
version = "4.9.3"
description = "Powerful and Pythonic XML processing library combining libxml2/libxslt with the ElementTree API."
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, != 3.4.*"
files = [
@@ -3165,7 +3122,6 @@ source = ["Cython (>=0.29.35)"]
name = "lz4"
version = "4.3.2"
description = "LZ4 Bindings for Python"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3212,28 +3168,46 @@ flake8 = ["flake8"]
tests = ["psutil", "pytest (!=3.3.0)", "pytest-cov"]
[[package]]
-name = "markdown"
-version = "3.4.3"
-description = "Python implementation of John Gruber's Markdown."
-category = "main"
+name = "mako"
+version = "1.2.4"
+description = "A super-fast templating language that borrows the best ideas from the existing templating languages."
optional = false
python-versions = ">=3.7"
files = [
- {file = "Markdown-3.4.3-py3-none-any.whl", hash = "sha256:065fd4df22da73a625f14890dd77eb8040edcbd68794bcd35943be14490608b2"},
- {file = "Markdown-3.4.3.tar.gz", hash = "sha256:8bf101198e004dc93e84a12a7395e31aac6a9c9942848ae1d99b9d72cf9b3520"},
+ {file = "Mako-1.2.4-py3-none-any.whl", hash = "sha256:c97c79c018b9165ac9922ae4f32da095ffd3c4e6872b45eded42926deea46818"},
+ {file = "Mako-1.2.4.tar.gz", hash = "sha256:d60a3903dc3bb01a18ad6a89cdbe2e4eadc69c0bc8ef1e3773ba53d44c3f7a34"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=0.9.2"
+
+[package.extras]
+babel = ["Babel"]
+lingua = ["lingua"]
+testing = ["pytest"]
+
+[[package]]
+name = "markdown"
+version = "3.4.4"
+description = "Python implementation of John Gruber's Markdown."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Markdown-3.4.4-py3-none-any.whl", hash = "sha256:a4c1b65c0957b4bd9e7d86ddc7b3c9868fb9670660f6f99f6d1bca8954d5a941"},
+ {file = "Markdown-3.4.4.tar.gz", hash = "sha256:225c6123522495d4119a90b3a3ba31a1e87a70369e03f14799ea9c0d7183a3d6"},
]
[package.dependencies]
importlib-metadata = {version = ">=4.4", markers = "python_version < \"3.10\""}
[package.extras]
+docs = ["mdx-gh-links (>=0.2)", "mkdocs (>=1.0)", "mkdocs-nature (>=0.4)"]
testing = ["coverage", "pyyaml"]
[[package]]
name = "markdown-it-py"
version = "3.0.0"
description = "Python port of markdown-it. Markdown parsing, done right!"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -3260,7 +3234,6 @@ testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"]
name = "markupsafe"
version = "2.1.3"
description = "Safely add untrusted strings to HTML/XML markup."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3320,7 +3293,6 @@ files = [
name = "marshmallow"
version = "3.20.1"
description = "A lightweight library for converting complex datatypes to and from native Python datatypes."
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -3341,7 +3313,6 @@ tests = ["pytest", "pytz", "simplejson"]
name = "matplotlib-inline"
version = "0.1.6"
description = "Inline Matplotlib backend for Jupyter"
-category = "dev"
optional = false
python-versions = ">=3.5"
files = [
@@ -3356,7 +3327,6 @@ traitlets = "*"
name = "mdit-py-plugins"
version = "0.4.0"
description = "Collection of plugins for markdown-it-py"
-category = "main"
optional = true
python-versions = ">=3.8"
files = [
@@ -3376,7 +3346,6 @@ testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"]
name = "mdurl"
version = "0.1.2"
description = "Markdown URL utilities"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3384,11 +3353,24 @@ files = [
{file = "mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba"},
]
+[[package]]
+name = "metaphor-python"
+version = "0.1.11"
+description = "A Python package for the Metaphor API."
+optional = false
+python-versions = "*"
+files = [
+ {file = "metaphor-python-0.1.11.tar.gz", hash = "sha256:80fd993c44cc9d453d99eb65b95147d305f542fdd6fda699e3852e3100beb6ec"},
+ {file = "metaphor_python-0.1.11-py3-none-any.whl", hash = "sha256:0d759ecdf73492a4bafd404d0444935c172bcc4a89334f4f6780863ba488b238"},
+]
+
+[package.dependencies]
+requests = "*"
+
[[package]]
name = "monotonic"
version = "1.6"
description = "An implementation of time.monotonic() for Python 2 & < 3.3"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -3398,21 +3380,19 @@ files = [
[[package]]
name = "more-itertools"
-version = "9.1.0"
+version = "10.1.0"
description = "More routines for operating on iterables, beyond itertools"
-category = "main"
optional = false
-python-versions = ">=3.7"
+python-versions = ">=3.8"
files = [
- {file = "more-itertools-9.1.0.tar.gz", hash = "sha256:cabaa341ad0389ea83c17a94566a53ae4c9d07349861ecb14dc6d0345cf9ac5d"},
- {file = "more_itertools-9.1.0-py3-none-any.whl", hash = "sha256:d2bc7f02446e86a68911e58ded76d6561eea00cddfb2a91e7019bbb586c799f3"},
+ {file = "more-itertools-10.1.0.tar.gz", hash = "sha256:626c369fa0eb37bac0291bce8259b332fd59ac792fa5497b59837309cd5b114a"},
+ {file = "more_itertools-10.1.0-py3-none-any.whl", hash = "sha256:64e0735fcfdc6f3464ea133afe8ea4483b1c5fe3a3d69852e6503b43a0b222e6"},
]
[[package]]
name = "mpmath"
version = "1.3.0"
description = "Python library for arbitrary-precision floating-point arithmetic"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -3430,7 +3410,6 @@ tests = ["pytest (>=4.6)"]
name = "msg-parser"
version = "1.2.0"
description = "This module enables reading, parsing and converting Microsoft Outlook MSG E-Mail files."
-category = "main"
optional = false
python-versions = ">=3.4"
files = [
@@ -3448,7 +3427,6 @@ rtf = ["compressed-rtf (>=1.0.5)"]
name = "multidict"
version = "6.0.4"
description = "multidict implementation"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3532,7 +3510,6 @@ files = [
name = "multiprocess"
version = "0.70.15"
description = "better multiprocessing and multithreading in Python"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3559,38 +3536,33 @@ dill = ">=0.3.7"
[[package]]
name = "mypy"
-version = "1.4.1"
+version = "1.5.0"
description = "Optional static typing for Python"
-category = "dev"
optional = false
-python-versions = ">=3.7"
+python-versions = ">=3.8"
files = [
- {file = "mypy-1.4.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:566e72b0cd6598503e48ea610e0052d1b8168e60a46e0bfd34b3acf2d57f96a8"},
- {file = "mypy-1.4.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ca637024ca67ab24a7fd6f65d280572c3794665eaf5edcc7e90a866544076878"},
- {file = "mypy-1.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0dde1d180cd84f0624c5dcaaa89c89775550a675aff96b5848de78fb11adabcd"},
- {file = "mypy-1.4.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:8c4d8e89aa7de683e2056a581ce63c46a0c41e31bd2b6d34144e2c80f5ea53dc"},
- {file = "mypy-1.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:bfdca17c36ae01a21274a3c387a63aa1aafe72bff976522886869ef131b937f1"},
- {file = "mypy-1.4.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:7549fbf655e5825d787bbc9ecf6028731973f78088fbca3a1f4145c39ef09462"},
- {file = "mypy-1.4.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:98324ec3ecf12296e6422939e54763faedbfcc502ea4a4c38502082711867258"},
- {file = "mypy-1.4.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:141dedfdbfe8a04142881ff30ce6e6653c9685b354876b12e4fe6c78598b45e2"},
- {file = "mypy-1.4.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:8207b7105829eca6f3d774f64a904190bb2231de91b8b186d21ffd98005f14a7"},
- {file = "mypy-1.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:16f0db5b641ba159eff72cff08edc3875f2b62b2fa2bc24f68c1e7a4e8232d01"},
- {file = "mypy-1.4.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:470c969bb3f9a9efcedbadcd19a74ffb34a25f8e6b0e02dae7c0e71f8372f97b"},
- {file = "mypy-1.4.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e5952d2d18b79f7dc25e62e014fe5a23eb1a3d2bc66318df8988a01b1a037c5b"},
- {file = "mypy-1.4.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:190b6bab0302cec4e9e6767d3eb66085aef2a1cc98fe04936d8a42ed2ba77bb7"},
- {file = "mypy-1.4.1-cp37-cp37m-win_amd64.whl", hash = "sha256:9d40652cc4fe33871ad3338581dca3297ff5f2213d0df345bcfbde5162abf0c9"},
- {file = "mypy-1.4.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:01fd2e9f85622d981fd9063bfaef1aed6e336eaacca00892cd2d82801ab7c042"},
- {file = "mypy-1.4.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:2460a58faeea905aeb1b9b36f5065f2dc9a9c6e4c992a6499a2360c6c74ceca3"},
- {file = "mypy-1.4.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a2746d69a8196698146a3dbe29104f9eb6a2a4d8a27878d92169a6c0b74435b6"},
- {file = "mypy-1.4.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:ae704dcfaa180ff7c4cfbad23e74321a2b774f92ca77fd94ce1049175a21c97f"},
- {file = "mypy-1.4.1-cp38-cp38-win_amd64.whl", hash = "sha256:43d24f6437925ce50139a310a64b2ab048cb2d3694c84c71c3f2a1626d8101dc"},
- {file = "mypy-1.4.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c482e1246726616088532b5e964e39765b6d1520791348e6c9dc3af25b233828"},
- {file = "mypy-1.4.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:43b592511672017f5b1a483527fd2684347fdffc041c9ef53428c8dc530f79a3"},
- {file = "mypy-1.4.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:34a9239d5b3502c17f07fd7c0b2ae6b7dd7d7f6af35fbb5072c6208e76295816"},
- {file = "mypy-1.4.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5703097c4936bbb9e9bce41478c8d08edd2865e177dc4c52be759f81ee4dd26c"},
- {file = "mypy-1.4.1-cp39-cp39-win_amd64.whl", hash = "sha256:e02d700ec8d9b1859790c0475df4e4092c7bf3272a4fd2c9f33d87fac4427b8f"},
- {file = "mypy-1.4.1-py3-none-any.whl", hash = "sha256:45d32cec14e7b97af848bddd97d85ea4f0db4d5a149ed9676caa4eb2f7402bb4"},
- {file = "mypy-1.4.1.tar.gz", hash = "sha256:9bbcd9ab8ea1f2e1c8031c21445b511442cc45c89951e49bbf852cbb70755b1b"},
+ {file = "mypy-1.5.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ad3109bec37cc33654de8db30fe8ff3a1bb57ea65144167d68185e6dced9868d"},
+ {file = "mypy-1.5.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:b4ea3a0241cb005b0ccdbd318fb99619b21ae51bcf1660b95fc22e0e7d3ba4a1"},
+ {file = "mypy-1.5.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1fe816e26e676c1311b9e04fd576543b873576d39439f7c24c8e5c7728391ecf"},
+ {file = "mypy-1.5.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:42170e68adb1603ccdc55a30068f72bcfcde2ce650188e4c1b2a93018b826735"},
+ {file = "mypy-1.5.0-cp310-cp310-win_amd64.whl", hash = "sha256:d145b81a8214687cfc1f85c03663a5bbe736777410e5580e54d526e7e904f564"},
+ {file = "mypy-1.5.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c36011320e452eb30bec38b9fd3ba20569dc9545d7d4540d967f3ea1fab9c374"},
+ {file = "mypy-1.5.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f3940cf5845b2512b3ab95463198b0cdf87975dfd17fdcc6ce9709a9abe09e69"},
+ {file = "mypy-1.5.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9166186c498170e1ff478a7f540846b2169243feb95bc228d39a67a1a450cdc6"},
+ {file = "mypy-1.5.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:725b57a19b7408ef66a0fd9db59b5d3e528922250fb56e50bded27fea9ff28f0"},
+ {file = "mypy-1.5.0-cp311-cp311-win_amd64.whl", hash = "sha256:eec5c927aa4b3e8b4781840f1550079969926d0a22ce38075f6cfcf4b13e3eb4"},
+ {file = "mypy-1.5.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:79c520aa24f21852206b5ff2cf746dc13020113aa73fa55af504635a96e62718"},
+ {file = "mypy-1.5.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:769ddb6bfe55c2bd9c7d6d7020885a5ea14289619db7ee650e06b1ef0852c6f4"},
+ {file = "mypy-1.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cbf18f8db7e5f060d61c91e334d3b96d6bb624ddc9ee8a1cde407b737acbca2c"},
+ {file = "mypy-1.5.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:a2500ad063413bc873ae102cf655bf49889e0763b260a3a7cf544a0cbbf7e70a"},
+ {file = "mypy-1.5.0-cp38-cp38-win_amd64.whl", hash = "sha256:84cf9f7d8a8a22bb6a36444480f4cbf089c917a4179fbf7eea003ea931944a7f"},
+ {file = "mypy-1.5.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:a551ed0fc02455fe2c1fb0145160df8336b90ab80224739627b15ebe2b45e9dc"},
+ {file = "mypy-1.5.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:372fd97293ed0076d52695849f59acbbb8461c4ab447858cdaeaf734a396d823"},
+ {file = "mypy-1.5.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c8a7444d6fcac7e2585b10abb91ad900a576da7af8f5cffffbff6065d9115813"},
+ {file = "mypy-1.5.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:35b13335c6c46a386577a51f3d38b2b5d14aa619e9633bb756bd77205e4bd09f"},
+ {file = "mypy-1.5.0-cp39-cp39-win_amd64.whl", hash = "sha256:2c9d570f53908cbea326ad8f96028a673b814d9dca7515bf71d95fa662c3eb6f"},
+ {file = "mypy-1.5.0-py3-none-any.whl", hash = "sha256:69b32d0dedd211b80f1b7435644e1ef83033a2af2ac65adcdc87c38db68a86be"},
+ {file = "mypy-1.5.0.tar.gz", hash = "sha256:f3460f34b3839b9bc84ee3ed65076eb827cd99ed13ed08d723f9083cada4a212"},
]
[package.dependencies]
@@ -3601,14 +3573,12 @@ typing-extensions = ">=4.1.0"
[package.extras]
dmypy = ["psutil (>=4.0)"]
install-types = ["pip"]
-python2 = ["typed-ast (>=1.4.0,<2)"]
reports = ["lxml"]
[[package]]
name = "mypy-extensions"
version = "1.0.0"
description = "Type system extensions for programs checked with the mypy type checker."
-category = "main"
optional = false
python-versions = ">=3.5"
files = [
@@ -3618,21 +3588,19 @@ files = [
[[package]]
name = "nest-asyncio"
-version = "1.5.6"
+version = "1.5.7"
description = "Patch asyncio to allow nested event loops"
-category = "main"
optional = false
python-versions = ">=3.5"
files = [
- {file = "nest_asyncio-1.5.6-py3-none-any.whl", hash = "sha256:b9a953fb40dceaa587d109609098db21900182b16440652454a146cffb06e8b8"},
- {file = "nest_asyncio-1.5.6.tar.gz", hash = "sha256:d267cc1ff794403f7df692964d1d2a3fa9418ffea2a3f6859a439ff482fef290"},
+ {file = "nest_asyncio-1.5.7-py3-none-any.whl", hash = "sha256:5301c82941b550b3123a1ea772ba9a1c80bad3a182be8c1a5ae6ad3be57a9657"},
+ {file = "nest_asyncio-1.5.7.tar.gz", hash = "sha256:6a80f7b98f24d9083ed24608977c09dd608d83f91cccc24c9d2cba6d10e01c10"},
]
[[package]]
name = "networkx"
version = "3.1"
description = "Python package for creating and manipulating graphs and networks"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -3651,7 +3619,6 @@ test = ["codecov (>=2.1)", "pytest (>=7.2)", "pytest-cov (>=4.0)"]
name = "nltk"
version = "3.8.1"
description = "Natural Language Toolkit"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3675,42 +3642,41 @@ twitter = ["twython"]
[[package]]
name = "numexpr"
-version = "2.8.4"
+version = "2.8.5"
description = "Fast numerical expression evaluator for NumPy"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "numexpr-2.8.4-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:a75967d46b6bd56455dd32da6285e5ffabe155d0ee61eef685bbfb8dafb2e484"},
- {file = "numexpr-2.8.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:db93cf1842f068247de631bfc8af20118bf1f9447cd929b531595a5e0efc9346"},
- {file = "numexpr-2.8.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7bca95f4473b444428061d4cda8e59ac564dc7dc6a1dea3015af9805c6bc2946"},
- {file = "numexpr-2.8.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9e34931089a6bafc77aaae21f37ad6594b98aa1085bb8b45d5b3cd038c3c17d9"},
- {file = "numexpr-2.8.4-cp310-cp310-win32.whl", hash = "sha256:f3a920bfac2645017110b87ddbe364c9c7a742870a4d2f6120b8786c25dc6db3"},
- {file = "numexpr-2.8.4-cp310-cp310-win_amd64.whl", hash = "sha256:6931b1e9d4f629f43c14b21d44f3f77997298bea43790cfcdb4dd98804f90783"},
- {file = "numexpr-2.8.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:9400781553541f414f82eac056f2b4c965373650df9694286b9bd7e8d413f8d8"},
- {file = "numexpr-2.8.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6ee9db7598dd4001138b482342b96d78110dd77cefc051ec75af3295604dde6a"},
- {file = "numexpr-2.8.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ff5835e8af9a212e8480003d731aad1727aaea909926fd009e8ae6a1cba7f141"},
- {file = "numexpr-2.8.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:655d84eb09adfee3c09ecf4a89a512225da153fdb7de13c447404b7d0523a9a7"},
- {file = "numexpr-2.8.4-cp311-cp311-win32.whl", hash = "sha256:5538b30199bfc68886d2be18fcef3abd11d9271767a7a69ff3688defe782800a"},
- {file = "numexpr-2.8.4-cp311-cp311-win_amd64.whl", hash = "sha256:3f039321d1c17962c33079987b675fb251b273dbec0f51aac0934e932446ccc3"},
- {file = "numexpr-2.8.4-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c867cc36cf815a3ec9122029874e00d8fbcef65035c4a5901e9b120dd5d626a2"},
- {file = "numexpr-2.8.4-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:059546e8f6283ccdb47c683101a890844f667fa6d56258d48ae2ecf1b3875957"},
- {file = "numexpr-2.8.4-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:845a6aa0ed3e2a53239b89c1ebfa8cf052d3cc6e053c72805e8153300078c0b1"},
- {file = "numexpr-2.8.4-cp37-cp37m-win32.whl", hash = "sha256:a38664e699526cb1687aefd9069e2b5b9387da7feac4545de446141f1ef86f46"},
- {file = "numexpr-2.8.4-cp37-cp37m-win_amd64.whl", hash = "sha256:eaec59e9bf70ff05615c34a8b8d6c7bd042bd9f55465d7b495ea5436f45319d0"},
- {file = "numexpr-2.8.4-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b318541bf3d8326682ebada087ba0050549a16d8b3fa260dd2585d73a83d20a7"},
- {file = "numexpr-2.8.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b076db98ca65eeaf9bd224576e3ac84c05e451c0bd85b13664b7e5f7b62e2c70"},
- {file = "numexpr-2.8.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:90f12cc851240f7911a47c91aaf223dba753e98e46dff3017282e633602e76a7"},
- {file = "numexpr-2.8.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c368aa35ae9b18840e78b05f929d3a7b3abccdba9630a878c7db74ca2368339"},
- {file = "numexpr-2.8.4-cp38-cp38-win32.whl", hash = "sha256:b96334fc1748e9ec4f93d5fadb1044089d73fb08208fdb8382ed77c893f0be01"},
- {file = "numexpr-2.8.4-cp38-cp38-win_amd64.whl", hash = "sha256:a6d2d7740ae83ba5f3531e83afc4b626daa71df1ef903970947903345c37bd03"},
- {file = "numexpr-2.8.4-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:77898fdf3da6bb96aa8a4759a8231d763a75d848b2f2e5c5279dad0b243c8dfe"},
- {file = "numexpr-2.8.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:df35324666b693f13a016bc7957de7cc4d8801b746b81060b671bf78a52b9037"},
- {file = "numexpr-2.8.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:17ac9cfe6d0078c5fc06ba1c1bbd20b8783f28c6f475bbabd3cad53683075cab"},
- {file = "numexpr-2.8.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:df3a1f6b24214a1ab826e9c1c99edf1686c8e307547a9aef33910d586f626d01"},
- {file = "numexpr-2.8.4-cp39-cp39-win32.whl", hash = "sha256:7d71add384adc9119568d7e9ffa8a35b195decae81e0abf54a2b7779852f0637"},
- {file = "numexpr-2.8.4-cp39-cp39-win_amd64.whl", hash = "sha256:9f096d707290a6a00b6ffdaf581ee37331109fb7b6c8744e9ded7c779a48e517"},
- {file = "numexpr-2.8.4.tar.gz", hash = "sha256:d5432537418d18691b9115d615d6daa17ee8275baef3edf1afbbf8bc69806147"},
+ {file = "numexpr-2.8.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:51f3ab160c3847ebcca93cd88f935a7802b54a01ab63fe93152994a64d7a6cf2"},
+ {file = "numexpr-2.8.5-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:de29c77f674e4eb8f0846525a475cab64008c227c8bc4ba5153ab3f72441cc63"},
+ {file = "numexpr-2.8.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bf85ba1327eb87ec82ae7936f13c8850fb969a0ca34f3ba9fa3897c09d5c80d7"},
+ {file = "numexpr-2.8.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3c00be69f747f44a631830215cab482f0f77f75af2925695adff57c1cc0f9a68"},
+ {file = "numexpr-2.8.5-cp310-cp310-win32.whl", hash = "sha256:c46350dcdb93e32f033eea5a21269514ffcaf501d9abd6036992d37e48a308b0"},
+ {file = "numexpr-2.8.5-cp310-cp310-win_amd64.whl", hash = "sha256:894b027438b8ec88dea32a19193716c79f4ff8ddb92302dcc9731b51ba3565a8"},
+ {file = "numexpr-2.8.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6df184d40d4cf9f21c71f429962f39332f7398147762588c9f3a5c77065d0c06"},
+ {file = "numexpr-2.8.5-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:178b85ad373c6903e55d75787d61b92380439b70d94b001cb055a501b0821335"},
+ {file = "numexpr-2.8.5-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:578fe4008e4d5d6ff01bbeb2d7b7ba1ec658a5cda9c720cd26a9a8325f8ef438"},
+ {file = "numexpr-2.8.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ef621b4ee366a5c6a484f6678c9259f5b826569f8bfa0b89ba2306d5055468bb"},
+ {file = "numexpr-2.8.5-cp311-cp311-win32.whl", hash = "sha256:dd57ab1a3d3aaa9274aff1cefbf93b8ddacc7973afef5b125905f6bf18fabab0"},
+ {file = "numexpr-2.8.5-cp311-cp311-win_amd64.whl", hash = "sha256:783324ba40eb804ecfc9ebae86120a1e339ab112d0ab8a1f0d48a26354d5bf9b"},
+ {file = "numexpr-2.8.5-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:183d5430db76826e54465c69db93a3c6ecbf03cda5aa1bb96eaad0147e9b68dc"},
+ {file = "numexpr-2.8.5-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:39ce106f92ccea5b07b1d6f2f3c4370f05edf27691dc720a63903484a2137e48"},
+ {file = "numexpr-2.8.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b594dc9e2d6291a0bc5c065e6d9caf3eee743b5663897832e9b17753c002947a"},
+ {file = "numexpr-2.8.5-cp37-cp37m-win32.whl", hash = "sha256:62b4faf8e0627673b0210a837792bddd23050ecebc98069ab23eb0633ff1ef5f"},
+ {file = "numexpr-2.8.5-cp37-cp37m-win_amd64.whl", hash = "sha256:db5c65417d69414f1ab31302ea01d3548303ef31209c38b4849d145be4e1d1ba"},
+ {file = "numexpr-2.8.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:eb36ffcfa1606e41aa08d559b4277bcad0e16b83941d1a4fee8d2bd5a34f8e0e"},
+ {file = "numexpr-2.8.5-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:34af2a0e857d02a4bc5758bc037a777d50dacb13bcd57c7905268a3e44994ed6"},
+ {file = "numexpr-2.8.5-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5a8dad2bfaad5a5c34a2e8bbf62b9df1dfab266d345fda1feb20ff4e264b347a"},
+ {file = "numexpr-2.8.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b93f5a866cd13a808bc3d3a9c487d94cd02eec408b275ff0aa150f2e8e5191f8"},
+ {file = "numexpr-2.8.5-cp38-cp38-win32.whl", hash = "sha256:558390fea6370003ac749ed9d0f38d708aa096f5dcb707ddb6e0ca5a0dd37da1"},
+ {file = "numexpr-2.8.5-cp38-cp38-win_amd64.whl", hash = "sha256:55983806815035eb63c5039520688c49536bb7f3cc3fc1d7d64c6a00cf3f353e"},
+ {file = "numexpr-2.8.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:1510da20e6f5f45333610b1ded44c566e2690c6c437c84f2a212ca09627c7e01"},
+ {file = "numexpr-2.8.5-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9e8b5bf7bcb4e8dcd66522d8fc96e1db7278f901cb4fd2e155efbe62a41dde08"},
+ {file = "numexpr-2.8.5-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ed0e1c1ef5f34381448539f1fe9015906d21c9cfa2797c06194d4207dadb465"},
+ {file = "numexpr-2.8.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aea6ab45c87c0a7041183c08a798f0ad4d7c5eccbce20cfe79ce6f1a45ef3702"},
+ {file = "numexpr-2.8.5-cp39-cp39-win32.whl", hash = "sha256:cbfd833ee5fdb0efb862e152aee7e6ccea9c596d5c11d22604c2e6307bff7cad"},
+ {file = "numexpr-2.8.5-cp39-cp39-win_amd64.whl", hash = "sha256:283ce8609a7ccbadf91a68f3484558b3e36d27c93c98a41ec205efb0ab43c872"},
+ {file = "numexpr-2.8.5.tar.gz", hash = "sha256:45ed41e55a0abcecf3d711481e12a5fb7a904fe99d42bc282a17cc5f8ea510be"},
]
[package.dependencies]
@@ -3718,44 +3684,42 @@ numpy = ">=1.13.3"
[[package]]
name = "numpy"
-version = "1.25.1"
+version = "1.25.2"
description = "Fundamental package for array computing in Python"
-category = "main"
optional = false
python-versions = ">=3.9"
files = [
- {file = "numpy-1.25.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:77d339465dff3eb33c701430bcb9c325b60354698340229e1dff97745e6b3efa"},
- {file = "numpy-1.25.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d736b75c3f2cb96843a5c7f8d8ccc414768d34b0a75f466c05f3a739b406f10b"},
- {file = "numpy-1.25.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4a90725800caeaa160732d6b31f3f843ebd45d6b5f3eec9e8cc287e30f2805bf"},
- {file = "numpy-1.25.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c6c9261d21e617c6dc5eacba35cb68ec36bb72adcff0dee63f8fbc899362588"},
- {file = "numpy-1.25.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0def91f8af6ec4bb94c370e38c575855bf1d0be8a8fbfba42ef9c073faf2cf19"},
- {file = "numpy-1.25.1-cp310-cp310-win32.whl", hash = "sha256:fd67b306320dcadea700a8f79b9e671e607f8696e98ec255915c0c6d6b818503"},
- {file = "numpy-1.25.1-cp310-cp310-win_amd64.whl", hash = "sha256:c1516db588987450b85595586605742879e50dcce923e8973f79529651545b57"},
- {file = "numpy-1.25.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6b82655dd8efeea69dbf85d00fca40013d7f503212bc5259056244961268b66e"},
- {file = "numpy-1.25.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e8f6049c4878cb16960fbbfb22105e49d13d752d4d8371b55110941fb3b17800"},
- {file = "numpy-1.25.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41a56b70e8139884eccb2f733c2f7378af06c82304959e174f8e7370af112e09"},
- {file = "numpy-1.25.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d5154b1a25ec796b1aee12ac1b22f414f94752c5f94832f14d8d6c9ac40bcca6"},
- {file = "numpy-1.25.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:38eb6548bb91c421261b4805dc44def9ca1a6eef6444ce35ad1669c0f1a3fc5d"},
- {file = "numpy-1.25.1-cp311-cp311-win32.whl", hash = "sha256:791f409064d0a69dd20579345d852c59822c6aa087f23b07b1b4e28ff5880fcb"},
- {file = "numpy-1.25.1-cp311-cp311-win_amd64.whl", hash = "sha256:c40571fe966393b212689aa17e32ed905924120737194b5d5c1b20b9ed0fb171"},
- {file = "numpy-1.25.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3d7abcdd85aea3e6cdddb59af2350c7ab1ed764397f8eec97a038ad244d2d105"},
- {file = "numpy-1.25.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:1a180429394f81c7933634ae49b37b472d343cccb5bb0c4a575ac8bbc433722f"},
- {file = "numpy-1.25.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d412c1697c3853c6fc3cb9751b4915859c7afe6a277c2bf00acf287d56c4e625"},
- {file = "numpy-1.25.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:20e1266411120a4f16fad8efa8e0454d21d00b8c7cee5b5ccad7565d95eb42dd"},
- {file = "numpy-1.25.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:f76aebc3358ade9eacf9bc2bb8ae589863a4f911611694103af05346637df1b7"},
- {file = "numpy-1.25.1-cp39-cp39-win32.whl", hash = "sha256:247d3ffdd7775bdf191f848be8d49100495114c82c2bd134e8d5d075fb386a1c"},
- {file = "numpy-1.25.1-cp39-cp39-win_amd64.whl", hash = "sha256:1d5d3c68e443c90b38fdf8ef40e60e2538a27548b39b12b73132456847f4b631"},
- {file = "numpy-1.25.1-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:35a9527c977b924042170a0887de727cd84ff179e478481404c5dc66b4170009"},
- {file = "numpy-1.25.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0d3fe3dd0506a28493d82dc3cf254be8cd0d26f4008a417385cbf1ae95b54004"},
- {file = "numpy-1.25.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:012097b5b0d00a11070e8f2e261128c44157a8689f7dedcf35576e525893f4fe"},
- {file = "numpy-1.25.1.tar.gz", hash = "sha256:9a3a9f3a61480cc086117b426a8bd86869c213fc4072e606f01c4e4b66eb92bf"},
+ {file = "numpy-1.25.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:db3ccc4e37a6873045580d413fe79b68e47a681af8db2e046f1dacfa11f86eb3"},
+ {file = "numpy-1.25.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:90319e4f002795ccfc9050110bbbaa16c944b1c37c0baeea43c5fb881693ae1f"},
+ {file = "numpy-1.25.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dfe4a913e29b418d096e696ddd422d8a5d13ffba4ea91f9f60440a3b759b0187"},
+ {file = "numpy-1.25.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f08f2e037bba04e707eebf4bc934f1972a315c883a9e0ebfa8a7756eabf9e357"},
+ {file = "numpy-1.25.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:bec1e7213c7cb00d67093247f8c4db156fd03075f49876957dca4711306d39c9"},
+ {file = "numpy-1.25.2-cp310-cp310-win32.whl", hash = "sha256:7dc869c0c75988e1c693d0e2d5b26034644399dd929bc049db55395b1379e044"},
+ {file = "numpy-1.25.2-cp310-cp310-win_amd64.whl", hash = "sha256:834b386f2b8210dca38c71a6e0f4fd6922f7d3fcff935dbe3a570945acb1b545"},
+ {file = "numpy-1.25.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c5462d19336db4560041517dbb7759c21d181a67cb01b36ca109b2ae37d32418"},
+ {file = "numpy-1.25.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:c5652ea24d33585ea39eb6a6a15dac87a1206a692719ff45d53c5282e66d4a8f"},
+ {file = "numpy-1.25.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0d60fbae8e0019865fc4784745814cff1c421df5afee233db6d88ab4f14655a2"},
+ {file = "numpy-1.25.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:60e7f0f7f6d0eee8364b9a6304c2845b9c491ac706048c7e8cf47b83123b8dbf"},
+ {file = "numpy-1.25.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:bb33d5a1cf360304754913a350edda36d5b8c5331a8237268c48f91253c3a364"},
+ {file = "numpy-1.25.2-cp311-cp311-win32.whl", hash = "sha256:5883c06bb92f2e6c8181df7b39971a5fb436288db58b5a1c3967702d4278691d"},
+ {file = "numpy-1.25.2-cp311-cp311-win_amd64.whl", hash = "sha256:5c97325a0ba6f9d041feb9390924614b60b99209a71a69c876f71052521d42a4"},
+ {file = "numpy-1.25.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b79e513d7aac42ae918db3ad1341a015488530d0bb2a6abcbdd10a3a829ccfd3"},
+ {file = "numpy-1.25.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:eb942bfb6f84df5ce05dbf4b46673ffed0d3da59f13635ea9b926af3deb76926"},
+ {file = "numpy-1.25.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e0746410e73384e70d286f93abf2520035250aad8c5714240b0492a7302fdca"},
+ {file = "numpy-1.25.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d7806500e4f5bdd04095e849265e55de20d8cc4b661b038957354327f6d9b295"},
+ {file = "numpy-1.25.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8b77775f4b7df768967a7c8b3567e309f617dd5e99aeb886fa14dc1a0791141f"},
+ {file = "numpy-1.25.2-cp39-cp39-win32.whl", hash = "sha256:2792d23d62ec51e50ce4d4b7d73de8f67a2fd3ea710dcbc8563a51a03fb07b01"},
+ {file = "numpy-1.25.2-cp39-cp39-win_amd64.whl", hash = "sha256:76b4115d42a7dfc5d485d358728cdd8719be33cc5ec6ec08632a5d6fca2ed380"},
+ {file = "numpy-1.25.2-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:1a1329e26f46230bf77b02cc19e900db9b52f398d6722ca853349a782d4cff55"},
+ {file = "numpy-1.25.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c3abc71e8b6edba80a01a52e66d83c5d14433cbcd26a40c329ec7ed09f37901"},
+ {file = "numpy-1.25.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:1b9735c27cea5d995496f46a8b1cd7b408b3f34b6d50459d9ac8fe3a20cc17bf"},
+ {file = "numpy-1.25.2.tar.gz", hash = "sha256:fd608e19c8d7c55021dffd43bfe5492fab8cc105cc8986f813f8c3c048b38760"},
]
[[package]]
name = "olefile"
version = "0.46"
description = "Python package to parse, read and write Microsoft OLE2 files (Structured Storage or Compound Document, Microsoft Office)"
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
@@ -3766,7 +3730,6 @@ files = [
name = "onnxruntime"
version = "1.15.1"
description = "ONNX Runtime is a runtime accelerator for Machine Learning models"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -3808,7 +3771,6 @@ sympy = "*"
name = "openai"
version = "0.27.8"
description = "Python client library for the OpenAI API"
-category = "main"
optional = false
python-versions = ">=3.7.1"
files = [
@@ -3823,7 +3785,7 @@ tqdm = "*"
[package.extras]
datalib = ["numpy", "openpyxl (>=3.0.7)", "pandas (>=1.2.3)", "pandas-stubs (>=1.1.0.11)"]
-dev = ["black (>=21.6b0,<22.0)", "pytest (>=6.0.0,<7.0.0)", "pytest-asyncio", "pytest-mock"]
+dev = ["black (>=21.6b0,<22.0)", "pytest (==6.*)", "pytest-asyncio", "pytest-mock"]
embeddings = ["matplotlib", "numpy", "openpyxl (>=3.0.7)", "pandas (>=1.2.3)", "pandas-stubs (>=1.1.0.11)", "plotly", "scikit-learn (>=1.0.2)", "scipy", "tenacity (>=8.0.1)"]
wandb = ["numpy", "openpyxl (>=3.0.7)", "pandas (>=1.2.3)", "pandas-stubs (>=1.1.0.11)", "wandb"]
@@ -3831,7 +3793,6 @@ wandb = ["numpy", "openpyxl (>=3.0.7)", "pandas (>=1.2.3)", "pandas-stubs (>=1.1
name = "openapi-schema-pydantic"
version = "1.2.4"
description = "OpenAPI (v3) specification schema as pydantic class"
-category = "main"
optional = false
python-versions = ">=3.6.1"
files = [
@@ -3846,7 +3807,6 @@ pydantic = ">=1.8.2"
name = "openpyxl"
version = "3.1.2"
description = "A Python library to read/write Excel 2010 xlsx/xlsm files"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -3861,7 +3821,6 @@ et-xmlfile = "*"
name = "opentelemetry-api"
version = "1.19.0"
description = "OpenTelemetry Python API"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3877,7 +3836,6 @@ importlib-metadata = ">=6.0,<7.0"
name = "opentelemetry-exporter-otlp"
version = "1.19.0"
description = "OpenTelemetry Collector Exporters"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3893,7 +3851,6 @@ opentelemetry-exporter-otlp-proto-http = "1.19.0"
name = "opentelemetry-exporter-otlp-proto-common"
version = "1.19.0"
description = "OpenTelemetry Protobuf encoding"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3908,7 +3865,6 @@ opentelemetry-proto = "1.19.0"
name = "opentelemetry-exporter-otlp-proto-grpc"
version = "1.19.0"
description = "OpenTelemetry Collector Protobuf over gRPC Exporter"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3933,7 +3889,6 @@ test = ["pytest-grpc"]
name = "opentelemetry-exporter-otlp-proto-http"
version = "1.19.0"
description = "OpenTelemetry Collector Protobuf over HTTP Exporter"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3958,7 +3913,6 @@ test = ["responses (==0.22.0)"]
name = "opentelemetry-exporter-prometheus"
version = "1.12.0rc1"
description = "Prometheus Metric Exporter for OpenTelemetry"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -3975,7 +3929,6 @@ prometheus-client = ">=0.5.0,<1.0.0"
name = "opentelemetry-instrumentation"
version = "0.40b0"
description = "Instrumentation Tools & Auto Instrumentation for OpenTelemetry Python"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -3992,7 +3945,6 @@ wrapt = ">=1.0.0,<2.0.0"
name = "opentelemetry-instrumentation-aiohttp-client"
version = "0.40b0"
description = "OpenTelemetry aiohttp client instrumentation"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -4015,7 +3967,6 @@ test = ["http-server-mock", "opentelemetry-instrumentation-aiohttp-client[instru
name = "opentelemetry-instrumentation-asgi"
version = "0.40b0"
description = "ASGI instrumentation for OpenTelemetry"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -4038,7 +3989,6 @@ test = ["opentelemetry-instrumentation-asgi[instruments]", "opentelemetry-test-u
name = "opentelemetry-instrumentation-fastapi"
version = "0.40b0"
description = "OpenTelemetry FastAPI Instrumentation"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -4061,7 +4011,6 @@ test = ["httpx (>=0.22,<1.0)", "opentelemetry-instrumentation-fastapi[instrument
name = "opentelemetry-instrumentation-grpc"
version = "0.40b0"
description = "OpenTelemetry gRPC instrumentation"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -4084,7 +4033,6 @@ test = ["opentelemetry-instrumentation-grpc[instruments]", "opentelemetry-sdk (>
name = "opentelemetry-proto"
version = "1.19.0"
description = "OpenTelemetry Python Proto"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -4099,7 +4047,6 @@ protobuf = ">=3.19,<5.0"
name = "opentelemetry-sdk"
version = "1.19.0"
description = "OpenTelemetry Python SDK"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -4116,7 +4063,6 @@ typing-extensions = ">=3.7.4"
name = "opentelemetry-semantic-conventions"
version = "0.40b0"
description = "OpenTelemetry Semantic Conventions"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -4128,7 +4074,6 @@ files = [
name = "opentelemetry-util-http"
version = "0.40b0"
description = "Web util for OpenTelemetry"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -4138,77 +4083,84 @@ files = [
[[package]]
name = "orjson"
-version = "3.9.2"
+version = "3.9.3"
description = "Fast, correct Python JSON library supporting dataclasses, datetimes, and numpy"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "orjson-3.9.2-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:7323e4ca8322b1ecb87562f1ec2491831c086d9faa9a6c6503f489dadbed37d7"},
- {file = "orjson-3.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1272688ea1865f711b01ba479dea2d53e037ea00892fd04196b5875f7021d9d3"},
- {file = "orjson-3.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0b9a26f1d1427a9101a1e8910f2e2df1f44d3d18ad5480ba031b15d5c1cb282e"},
- {file = "orjson-3.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6a5ca55b0d8f25f18b471e34abaee4b175924b6cd62f59992945b25963443141"},
- {file = "orjson-3.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:877872db2c0f41fbe21f852ff642ca842a43bc34895b70f71c9d575df31fffb4"},
- {file = "orjson-3.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a39c2529d75373b7167bf84c814ef9b8f3737a339c225ed6c0df40736df8748"},
- {file = "orjson-3.9.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:84ebd6fdf138eb0eb4280045442331ee71c0aab5e16397ba6645f32f911bfb37"},
- {file = "orjson-3.9.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:5a60a1cfcfe310547a1946506dd4f1ed0a7d5bd5b02c8697d9d5dcd8d2e9245e"},
- {file = "orjson-3.9.2-cp310-none-win_amd64.whl", hash = "sha256:c290c4f81e8fd0c1683638802c11610b2f722b540f8e5e858b6914b495cf90c8"},
- {file = "orjson-3.9.2-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:02ef014f9a605e84b675060785e37ec9c0d2347a04f1307a9d6840ab8ecd6f55"},
- {file = "orjson-3.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:992af54265ada1c1579500d6594ed73fe333e726de70d64919cf37f93defdd06"},
- {file = "orjson-3.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a40958f7af7c6d992ee67b2da4098dca8b770fc3b4b3834d540477788bfa76d3"},
- {file = "orjson-3.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:93864dec3e3dd058a2dbe488d11ac0345214a6a12697f53a63e34de7d28d4257"},
- {file = "orjson-3.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:16fdf5a82df80c544c3c91516ab3882cd1ac4f1f84eefeafa642e05cef5f6699"},
- {file = "orjson-3.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:275b5a18fd9ed60b2720543d3ddac170051c43d680e47d04ff5203d2c6d8ebf1"},
- {file = "orjson-3.9.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:b9aea6dcb99fcbc9f6d1dd84fca92322fda261da7fb014514bb4689c7c2097a8"},
- {file = "orjson-3.9.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7d74ae0e101d17c22ef67b741ba356ab896fc0fa64b301c2bf2bb0a4d874b190"},
- {file = "orjson-3.9.2-cp311-none-win_amd64.whl", hash = "sha256:6320b28e7bdb58c3a3a5efffe04b9edad3318d82409e84670a9b24e8035a249d"},
- {file = "orjson-3.9.2-cp37-cp37m-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:368e9cc91ecb7ac21f2aa475e1901204110cf3e714e98649c2502227d248f947"},
- {file = "orjson-3.9.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:58e9e70f0dcd6a802c35887f306b555ff7a214840aad7de24901fc8bd9cf5dde"},
- {file = "orjson-3.9.2-cp37-cp37m-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:00c983896c2e01c94c0ef72fd7373b2aa06d0c0eed0342c4884559f812a6835b"},
- {file = "orjson-3.9.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2ee743e8890b16c87a2f89733f983370672272b61ee77429c0a5899b2c98c1a7"},
- {file = "orjson-3.9.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b7b065942d362aad4818ff599d2f104c35a565c2cbcbab8c09ec49edba91da75"},
- {file = "orjson-3.9.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e46e9c5b404bb9e41d5555762fd410d5466b7eb1ec170ad1b1609cbebe71df21"},
- {file = "orjson-3.9.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:8170157288714678ffd64f5de33039e1164a73fd8b6be40a8a273f80093f5c4f"},
- {file = "orjson-3.9.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:e3e2f087161947dafe8319ea2cfcb9cea4bb9d2172ecc60ac3c9738f72ef2909"},
- {file = "orjson-3.9.2-cp37-none-win_amd64.whl", hash = "sha256:d7de3dbbe74109ae598692113cec327fd30c5a30ebca819b21dfa4052f7b08ef"},
- {file = "orjson-3.9.2-cp38-cp38-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:8cd4385c59bbc1433cad4a80aca65d2d9039646a9c57f8084897549b55913b17"},
- {file = "orjson-3.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a74036aab1a80c361039290cdbc51aa7adc7ea13f56e5ef94e9be536abd227bd"},
- {file = "orjson-3.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1aaa46d7d4ae55335f635eadc9be0bd9bcf742e6757209fc6dc697e390010adc"},
- {file = "orjson-3.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2e52c67ed6bb368083aa2078ea3ccbd9721920b93d4b06c43eb4e20c4c860046"},
- {file = "orjson-3.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1a6cdfcf9c7dd4026b2b01fdff56986251dc0cc1e980c690c79eec3ae07b36e7"},
- {file = "orjson-3.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1882a70bb69595b9ec5aac0040a819e94d2833fe54901e2b32f5e734bc259a8b"},
- {file = "orjson-3.9.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:fc05e060d452145ab3c0b5420769e7356050ea311fc03cb9d79c481982917cca"},
- {file = "orjson-3.9.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:f8bc2c40d9bb26efefb10949d261a47ca196772c308babc538dd9f4b73e8d386"},
- {file = "orjson-3.9.2-cp38-none-win_amd64.whl", hash = "sha256:3164fc20a585ec30a9aff33ad5de3b20ce85702b2b2a456852c413e3f0d7ab09"},
- {file = "orjson-3.9.2-cp39-cp39-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:7a6ccadf788531595ed4728aa746bc271955448d2460ff0ef8e21eb3f2a281ba"},
- {file = "orjson-3.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3245d230370f571c945f69aab823c279a868dc877352817e22e551de155cb06c"},
- {file = "orjson-3.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:205925b179550a4ee39b8418dd4c94ad6b777d165d7d22614771c771d44f57bd"},
- {file = "orjson-3.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0325fe2d69512187761f7368c8cda1959bcb75fc56b8e7a884e9569112320e57"},
- {file = "orjson-3.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:806704cd58708acc66a064a9a58e3be25cf1c3f9f159e8757bd3f515bfabdfa1"},
- {file = "orjson-3.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:03fb36f187a0c19ff38f6289418863df8b9b7880cdbe279e920bef3a09d8dab1"},
- {file = "orjson-3.9.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:20925d07a97c49c6305bff1635318d9fc1804aa4ccacb5fb0deb8a910e57d97a"},
- {file = "orjson-3.9.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:eebfed53bec5674e981ebe8ed2cf00b3f7bcda62d634733ff779c264307ea505"},
- {file = "orjson-3.9.2-cp39-none-win_amd64.whl", hash = "sha256:869b961df5fcedf6c79f4096119b35679b63272362e9b745e668f0391a892d39"},
- {file = "orjson-3.9.2.tar.gz", hash = "sha256:24257c8f641979bf25ecd3e27251b5cc194cdd3a6e96004aac8446f5e63d9664"},
+ {file = "orjson-3.9.3-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:082714b5554fcced092c45272f22a93400389733083c43f5043c4316e86f57a2"},
+ {file = "orjson-3.9.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:97ddec69ca4fa1b66d512cf4f4a3fe6a57c4bf21209295ab2f4ada415996e08a"},
+ {file = "orjson-3.9.3-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ab7501722ec2172b1c6ea333bc47bba3bbb9b5fc0e3e891191e8447f43d3187d"},
+ {file = "orjson-3.9.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5ae680163ab09f04683d35fbd63eee858019f0066640f7cbad4dba3e7422a4bc"},
+ {file = "orjson-3.9.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7e5abca1e0a9d110bab7346fab0acd3b7848d2ee13318bc24a31bbfbdad974b8"},
+ {file = "orjson-3.9.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c55f42a8b07cdb7d514cfaeb56f6e9029eef1cbc8e670ac31fc377c46b993cd1"},
+ {file = "orjson-3.9.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:303f1324f5ea516f8e874ea0f8d15c581caabdca59fc990705fc76f3bd9f3bdf"},
+ {file = "orjson-3.9.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:59c444e3931ea4fe7dec26d195486a681fedc0233230c9b84848f8e60affd4a4"},
+ {file = "orjson-3.9.3-cp310-none-win32.whl", hash = "sha256:63333de96d83091023c9c99cc579973a2977b15feb5cdc8d9660104c886e9ab8"},
+ {file = "orjson-3.9.3-cp310-none-win_amd64.whl", hash = "sha256:7bce6ff507a83c6a4b6b00726f3a7d7aed0b1f0884aac0440e95b55cac0b113e"},
+ {file = "orjson-3.9.3-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:ec4421f377cce51decd6ea3869a8b41e9f05c50bf6acef8284f8906e642992c4"},
+ {file = "orjson-3.9.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7b3177bd67756e53bdbd72c79fae3507796a67b67c32a16f4b55cad48ef25c13"},
+ {file = "orjson-3.9.3-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b21908252c8a13b8f48d4cccdb7fabb592824cf39c9fa4e9076015dd65eabeba"},
+ {file = "orjson-3.9.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f7b795c6ac344b0c49776b7e135a9bed0cd15b1ade2a4c7b3a19e3913247702e"},
+ {file = "orjson-3.9.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ac43842f5ba26e6f21b4e63312bd1137111a9b9821d7f7dfe189a4015c6c6bc"},
+ {file = "orjson-3.9.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8def4f6560c7b6dbc4b356dfd8e6624a018d920ce5a2864291a2bf1052cd6b68"},
+ {file = "orjson-3.9.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:bbc0dafd1de42c8dbfd6e5d1fe4deab15d2de474e11475921286bebefd109ec8"},
+ {file = "orjson-3.9.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:85b1870d5420292419b34002659082d77f31b13d4d8cbd67bed9d717c775a0fb"},
+ {file = "orjson-3.9.3-cp311-none-win32.whl", hash = "sha256:d6ece3f48f14a06c325181f2b9bd9a9827aac2ecdcad11eb12f561fb697eaaaa"},
+ {file = "orjson-3.9.3-cp311-none-win_amd64.whl", hash = "sha256:448feda092c681c0a5b8eec62dd4f625ad5d316dafd56c81fb3f05b5221827ff"},
+ {file = "orjson-3.9.3-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:413d7cf731f1222373360128a3d5232d52630a7355f446bf2659fc3445ec0b76"},
+ {file = "orjson-3.9.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:009a0f79804c604998b068f5f942e40546913ed45ee2f0a3d0e75695bf7543fa"},
+ {file = "orjson-3.9.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ce062844255cce4d6a8a150e8e78b9fcd6c5a3f1ff3f8792922de25827c25b9c"},
+ {file = "orjson-3.9.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:776659e18debe5de73c30b0957cd6454fcc61d87377fcb276441fca1b9f1305d"},
+ {file = "orjson-3.9.3-cp312-none-win_amd64.whl", hash = "sha256:47b237da3818c8e546df4d2162f0a5cfd50b7b58528907919a27244141e0e48e"},
+ {file = "orjson-3.9.3-cp37-cp37m-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:f954115d8496d4ab5975438e3ce07780c1644ea0a66c78a943ef79f33769b61a"},
+ {file = "orjson-3.9.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:05c57100517b6dbfe34181ed2248bebfab03bd2a7aafb6fbf849c6fd3bb2fbda"},
+ {file = "orjson-3.9.3-cp37-cp37m-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:aa6017140fe487ab8fae605a2890c94c6fbe7a8e763ff33bbdb00e27ce078cfd"},
+ {file = "orjson-3.9.3-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6fe77af2ff33c370fb06c9fdf004a66d85ea19c77f0273bbf70c70f98f832725"},
+ {file = "orjson-3.9.3-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e2fa8c385b27bab886caa098fa3ae114d56571ae6e7a5610cb624d7b0a66faed"},
+ {file = "orjson-3.9.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8323739e7905ae4ec4dbdebb31067d28be981f30c11b6ae88ddec2671c0b3194"},
+ {file = "orjson-3.9.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:ad43fd5b1ededb54fe01e67468710fcfec8a5830e4ce131f85e741ea151a18e9"},
+ {file = "orjson-3.9.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:42cb645780f732c829bc351346a54157d57f2bc409e671ee36b9fc1037bb77fe"},
+ {file = "orjson-3.9.3-cp37-none-win32.whl", hash = "sha256:b84542669d1b0175dc2870025b73cbd4f4a3beb17796de6ec82683663e0400f3"},
+ {file = "orjson-3.9.3-cp37-none-win_amd64.whl", hash = "sha256:1440a404ce84f43e2f8e97d8b5fe6f271458e0ffd37290dc3a9f6aa067c69930"},
+ {file = "orjson-3.9.3-cp38-cp38-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:1da8edaefb75f25b449ed4e22d00b9b49211b97dcefd44b742bdd8721d572788"},
+ {file = "orjson-3.9.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:47210746acda49febe3bb07253eb5d63d7c7511beec5fa702aad3ce64e15664f"},
+ {file = "orjson-3.9.3-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:893c62afd5b26f04e2814dffa4d9d4060583ac43dc3e79ed3eadf62a5ac37b2c"},
+ {file = "orjson-3.9.3-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:32aef33ae33901c327fd5679f91fa37199834d122dffd234416a6fe4193d1982"},
+ {file = "orjson-3.9.3-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bd2761384ddb9de63b20795845d5cedadf052255a34c3ff1750cfc77b29d9926"},
+ {file = "orjson-3.9.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:19e2502b4af2055050dcc74718f2647b65102087c6f5b3f939e2e1a3e3099602"},
+ {file = "orjson-3.9.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:fa7c7a39eeb8dd171f59d96fd4610f908ac14b2f2eb268f4498e5f310bda8da7"},
+ {file = "orjson-3.9.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:cc3fe0c0ae7acf00d827efe2506131f1b19af3c87e3d76b0e081748984e51c26"},
+ {file = "orjson-3.9.3-cp38-none-win32.whl", hash = "sha256:5b1ff8e920518753b310034e5796f0116f7732b0b27531012d46f0b54f3c8c85"},
+ {file = "orjson-3.9.3-cp38-none-win_amd64.whl", hash = "sha256:9f2b1007174c93dd838f52e623c972df33057e3cb7ad9341b7d9bbd66b8d8fb4"},
+ {file = "orjson-3.9.3-cp39-cp39-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:cddc5b8bd7b0d1dfd36637eedbd83726b8b8a5969d3ecee70a9b54a94b8a0258"},
+ {file = "orjson-3.9.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:43c3bbf4b6f94fad2fd73c81293da8b343fbd07ce48d7836c07d0d54b58c8e93"},
+ {file = "orjson-3.9.3-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a5cc22ef6973992db18952f8b978781e19a0c62c098f475db936284df9311df7"},
+ {file = "orjson-3.9.3-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9dcea93630986209c690f27f32398956b04ccbba8f1fa7c3d1bb88a01d9ab87a"},
+ {file = "orjson-3.9.3-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:526cb34e63faaad908c34597294507b7a4b999a436b4f206bc4e60ff4e911c20"},
+ {file = "orjson-3.9.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2f5ac6e30ee10af57f52e72f9c8b9bc4846a9343449d10ca2ae9760615da3042"},
+ {file = "orjson-3.9.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b6c37ab097c062bdf535105c7156839c4e370065c476bb2393149ad31a2cdf6e"},
+ {file = "orjson-3.9.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:27d69628f449c52a7a34836b15ec948804254f7954457f88de53f2f4de99512f"},
+ {file = "orjson-3.9.3-cp39-none-win32.whl", hash = "sha256:5297463d8831c2327ed22bf92eb6d50347071ff1c73fb4702d50b8bc514aeac9"},
+ {file = "orjson-3.9.3-cp39-none-win_amd64.whl", hash = "sha256:69a33486b5b6e5a99939fdb13c1c0d8bcc7c89fe6083e7b9ce3c70931ca9fb71"},
+ {file = "orjson-3.9.3.tar.gz", hash = "sha256:d3da4faf6398154c1e75d32778035fa7dc284814809f76e8f8d50c4f54859399"},
]
[[package]]
name = "overrides"
-version = "7.3.1"
+version = "7.4.0"
description = "A decorator to automatically detect mismatch when overriding a method."
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
- {file = "overrides-7.3.1-py3-none-any.whl", hash = "sha256:6187d8710a935d09b0bcef8238301d6ee2569d2ac1ae0ec39a8c7924e27f58ca"},
- {file = "overrides-7.3.1.tar.gz", hash = "sha256:8b97c6c1e1681b78cbc9424b138d880f0803c2254c5ebaabdde57bb6c62093f2"},
+ {file = "overrides-7.4.0-py3-none-any.whl", hash = "sha256:3ad24583f86d6d7a49049695efe9933e67ba62f0c7625d53c59fa832ce4b8b7d"},
+ {file = "overrides-7.4.0.tar.gz", hash = "sha256:9502a3cca51f4fac40b5feca985b6703a5c1f6ad815588a7ca9e285b9dca6757"},
]
[[package]]
name = "packaging"
version = "23.1"
description = "Core utilities for Python packages"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -4220,7 +4172,6 @@ files = [
name = "pandas"
version = "2.0.3"
description = "Powerful data structures for data analysis, time series, and statistics"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -4287,7 +4238,6 @@ xml = ["lxml (>=4.6.3)"]
name = "pandas-stubs"
version = "2.0.2.230605"
description = "Type annotations for pandas"
-category = "dev"
optional = false
python-versions = ">=3.8"
files = [
@@ -4303,7 +4253,6 @@ types-pytz = ">=2022.1.1"
name = "parso"
version = "0.8.3"
description = "A Python Parser"
-category = "dev"
optional = false
python-versions = ">=3.6"
files = [
@@ -4317,21 +4266,19 @@ testing = ["docopt", "pytest (<6.0.0)"]
[[package]]
name = "pathspec"
-version = "0.11.1"
+version = "0.11.2"
description = "Utility library for gitignore style pattern matching of file paths."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "pathspec-0.11.1-py3-none-any.whl", hash = "sha256:d8af70af76652554bd134c22b3e8a1cc46ed7d91edcdd721ef1a0c51a84a5293"},
- {file = "pathspec-0.11.1.tar.gz", hash = "sha256:2798de800fa92780e33acca925945e9a19a133b715067cf165b8866c15a31687"},
+ {file = "pathspec-0.11.2-py3-none-any.whl", hash = "sha256:1d6ed233af05e679efb96b1851550ea95bbb64b7c490b0f5aa52996c11e92a20"},
+ {file = "pathspec-0.11.2.tar.gz", hash = "sha256:e0d8d0ac2f12da61956eb2306b69f9469b42f4deb0f3cb6ed47b9cce9996ced3"},
]
[[package]]
name = "pdf2image"
version = "1.16.3"
description = "A wrapper around the pdftoppm and pdftocairo command line tools to convert PDF to a PIL Image list."
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -4346,7 +4293,6 @@ pillow = "*"
name = "pdfminer-six"
version = "20221105"
description = "PDF parser and analyzer"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -4367,7 +4313,6 @@ image = ["Pillow"]
name = "pexpect"
version = "4.8.0"
description = "Pexpect allows easy control of interactive console applications."
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -4382,7 +4327,6 @@ ptyprocess = ">=0.5"
name = "pickleshare"
version = "0.7.5"
description = "Tiny 'shelve'-like database with concurrency support"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -4394,7 +4338,6 @@ files = [
name = "pillow"
version = "10.0.0"
description = "Python Imaging Library (Fork)"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -4462,7 +4405,6 @@ tests = ["check-manifest", "coverage", "defusedxml", "markdown2", "olefile", "pa
name = "pinecone-client"
version = "2.2.2"
description = "Pinecone client and SDK"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -4488,7 +4430,6 @@ grpc = ["googleapis-common-protos (>=1.53.0)", "grpc-gateway-protoc-gen-openapiv
name = "pkginfo"
version = "1.9.6"
description = "Query metadata from sdists / bdists / installed packages."
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -4501,25 +4442,23 @@ testing = ["pytest", "pytest-cov"]
[[package]]
name = "platformdirs"
-version = "3.9.1"
+version = "3.10.0"
description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
-category = "dev"
optional = false
python-versions = ">=3.7"
files = [
- {file = "platformdirs-3.9.1-py3-none-any.whl", hash = "sha256:ad8291ae0ae5072f66c16945166cb11c63394c7a3ad1b1bc9828ca3162da8c2f"},
- {file = "platformdirs-3.9.1.tar.gz", hash = "sha256:1b42b450ad933e981d56e59f1b97495428c9bd60698baab9f3eb3d00d5822421"},
+ {file = "platformdirs-3.10.0-py3-none-any.whl", hash = "sha256:d7c24979f292f916dc9cbf8648319032f551ea8c49a4c9bf2fb556a02070ec1d"},
+ {file = "platformdirs-3.10.0.tar.gz", hash = "sha256:b45696dab2d7cc691a3226759c0d3b00c47c8b6e293d96f6436f733303f77f6d"},
]
[package.extras]
-docs = ["furo (>=2023.5.20)", "proselint (>=0.13)", "sphinx (>=7.0.1)", "sphinx-autodoc-typehints (>=1.23,!=1.23.4)"]
-test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=7.3.1)", "pytest-cov (>=4.1)", "pytest-mock (>=3.10)"]
+docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.1)", "sphinx-autodoc-typehints (>=1.24)"]
+test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=7.4)", "pytest-cov (>=4.1)", "pytest-mock (>=3.11.1)"]
[[package]]
name = "pluggy"
version = "1.2.0"
description = "plugin and hook calling mechanisms for python"
-category = "dev"
optional = false
python-versions = ">=3.7"
files = [
@@ -4535,7 +4474,6 @@ testing = ["pytest", "pytest-benchmark"]
name = "portalocker"
version = "2.7.0"
description = "Wraps the portalocker recipe for easy usage"
-category = "main"
optional = false
python-versions = ">=3.5"
files = [
@@ -4555,7 +4493,6 @@ tests = ["pytest (>=5.4.1)", "pytest-cov (>=2.8.1)", "pytest-mypy (>=0.8.0)", "p
name = "postgrest"
version = "0.10.6"
description = "PostgREST client for Python. This library provides an ORM interface to PostgREST."
-category = "main"
optional = false
python-versions = ">=3.8,<4.0"
files = [
@@ -4573,7 +4510,6 @@ strenum = ">=0.4.9,<0.5.0"
name = "posthog"
version = "3.0.1"
description = "Integrate PostHog into any python application."
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -4597,7 +4533,6 @@ test = ["coverage", "flake8", "freezegun (==0.3.15)", "mock (>=2.0.0)", "pylint"
name = "prometheus-client"
version = "0.17.1"
description = "Python client for the Prometheus monitoring system."
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -4612,7 +4547,6 @@ twisted = ["twisted"]
name = "prompt-toolkit"
version = "3.0.39"
description = "Library for building powerful interactive command lines in Python"
-category = "dev"
optional = false
python-versions = ">=3.7.0"
files = [
@@ -4627,7 +4561,6 @@ wcwidth = "*"
name = "proto-plus"
version = "1.22.3"
description = "Beautiful, Pythonic protocol buffers."
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -4645,7 +4578,6 @@ testing = ["google-api-core[grpc] (>=1.31.5)"]
name = "protobuf"
version = "3.20.3"
description = "Protocol Buffers"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -4677,7 +4609,6 @@ files = [
name = "psutil"
version = "5.9.5"
description = "Cross-platform lib for process and system monitoring in Python."
-category = "dev"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
@@ -4702,14 +4633,13 @@ test = ["enum34", "ipaddress", "mock", "pywin32", "wmi"]
[[package]]
name = "psycopg"
-version = "3.1.9"
+version = "3.1.10"
description = "PostgreSQL database adapter for Python"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "psycopg-3.1.9-py3-none-any.whl", hash = "sha256:fbbac339274d8733ee70ba9822297af3e8871790a26e967b5ea53e30a4b74dcc"},
- {file = "psycopg-3.1.9.tar.gz", hash = "sha256:ab400f207a8c120bafdd8077916d8f6c0106e809401378708485b016508c30c9"},
+ {file = "psycopg-3.1.10-py3-none-any.whl", hash = "sha256:8bbeddae5075c7890b2fa3e3553440376d3c5e28418335dee3c3656b06fa2b52"},
+ {file = "psycopg-3.1.10.tar.gz", hash = "sha256:15b25741494344c24066dc2479b0f383dd1b82fa5e75612fa4fa5bb30726e9b6"},
]
[package.dependencies]
@@ -4717,154 +4647,149 @@ typing-extensions = ">=4.1"
tzdata = {version = "*", markers = "sys_platform == \"win32\""}
[package.extras]
-binary = ["psycopg-binary (==3.1.9)"]
-c = ["psycopg-c (==3.1.9)"]
-dev = ["black (>=23.1.0)", "dnspython (>=2.1)", "flake8 (>=4.0)", "mypy (>=1.2)", "types-setuptools (>=57.4)", "wheel (>=0.37)"]
+binary = ["psycopg-binary (==3.1.10)"]
+c = ["psycopg-c (==3.1.10)"]
+dev = ["black (>=23.1.0)", "dnspython (>=2.1)", "flake8 (>=4.0)", "mypy (>=1.4.1)", "types-setuptools (>=57.4)", "wheel (>=0.37)"]
docs = ["Sphinx (>=5.0)", "furo (==2022.6.21)", "sphinx-autobuild (>=2021.3.14)", "sphinx-autodoc-typehints (>=1.12)"]
pool = ["psycopg-pool"]
-test = ["anyio (>=3.6.2)", "mypy (>=1.2)", "pproxy (>=2.7)", "pytest (>=6.2.5)", "pytest-cov (>=3.0)", "pytest-randomly (>=3.5)"]
+test = ["anyio (>=3.6.2)", "mypy (>=1.4.1)", "pproxy (>=2.7)", "pytest (>=6.2.5)", "pytest-cov (>=3.0)", "pytest-randomly (>=3.5)"]
[[package]]
name = "psycopg-binary"
-version = "3.1.9"
+version = "3.1.10"
description = "PostgreSQL database adapter for Python -- C optimisation distribution"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "psycopg_binary-3.1.9-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:284038cbe3f5a0f3de417af9b5eaa2a9524a3a06211523cf245111c71b566506"},
- {file = "psycopg_binary-3.1.9-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d2cea4bb0b19245c83486868d7c66f73238c4caa266b5b3c3d664d10dab2ab56"},
- {file = "psycopg_binary-3.1.9-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dfe5c5c31f59ccb1d1f473466baa93d800138186286e80e251f930e49c80d208"},
- {file = "psycopg_binary-3.1.9-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:82704a899d57c29beba5399d41eab5ef5c238b810d7e25e2d1916d2b34c4b1a3"},
- {file = "psycopg_binary-3.1.9-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:eab449e39db1c429cac79b7aa27e6827aad4995f32137e922db7254f43fed7b5"},
- {file = "psycopg_binary-3.1.9-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87e0c97733b11eeca3d24e56df70f3f9d792b2abd46f48be2fb2348ffc3e7e39"},
- {file = "psycopg_binary-3.1.9-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:81e34d6df54329424944d5ca91b1cc77df6b8a9130cb5480680d56f53d4e485c"},
- {file = "psycopg_binary-3.1.9-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e2f463079d99568a343ed0b766150b30627e9ed41de99fd82e945e7e2bec764a"},
- {file = "psycopg_binary-3.1.9-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:f2cbdef6568da21c39dfd45c2074e85eabbd00e1b721832ba94980f01f582dd4"},
- {file = "psycopg_binary-3.1.9-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:53afb0cc2ebe74651f339e22d05ec082a0f44939715d9138d357852f074fcf55"},
- {file = "psycopg_binary-3.1.9-cp310-cp310-win_amd64.whl", hash = "sha256:09167f106e7685591b4cdf58eff0191fb7435d586f384133a0dd30df646cf409"},
- {file = "psycopg_binary-3.1.9-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a8aaa47c1791fc05c0229ec1003dd49e13238fba9434e1fc3b879632f749c3c4"},
- {file = "psycopg_binary-3.1.9-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3d91ee0d33ac7b42d0488a9be2516efa2ec00901b81d69566ff34a7a94b66c0b"},
- {file = "psycopg_binary-3.1.9-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f5e36504373e5bcdc954b1da1c6fe66379007fe1e329790e8fb72b879a01e097"},
- {file = "psycopg_binary-3.1.9-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c1def6c2d28e257325b3b208cf1966343b498282a0f4d390fda7b7e0577da64"},
- {file = "psycopg_binary-3.1.9-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:055537a9c20efe9bf17cb72bd879602eda71de6f737ebafa1953e017c6a37fbe"},
- {file = "psycopg_binary-3.1.9-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5b164355d023a91b23dcc4bb3112bc7d6e9b9c938fb5abcb6e54457d2da1f317"},
- {file = "psycopg_binary-3.1.9-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:03b08545ce1c627f4d5e6384eda2946660c4ba6ceb0a09ae47de07419f725669"},
- {file = "psycopg_binary-3.1.9-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:1e31bac3d2d41e6446b20b591f638943328c958f4d1ce13d6f1c5db97c3a8dee"},
- {file = "psycopg_binary-3.1.9-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:a274c63c8fb9d419509bed2ef72befc1fd04243972e17e7f5afc5725cb13a560"},
- {file = "psycopg_binary-3.1.9-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:98d9d156b9ada08c271a79662fc5fcc1731b4d7c1f651ef5843d818d35f15ba0"},
- {file = "psycopg_binary-3.1.9-cp311-cp311-win_amd64.whl", hash = "sha256:c3a13aa022853891cadbc7256a9804e5989def760115c82334bddf0d19783b0b"},
- {file = "psycopg_binary-3.1.9-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b1a321ef3579a8de0545ade6ff1edfde0c88b8847d58c5615c03751c76054796"},
- {file = "psycopg_binary-3.1.9-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5833bda4c14f24c6a8ac08d3c5712acaa4f35aab31f9ccd2265e9e9a7d0151c8"},
- {file = "psycopg_binary-3.1.9-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a207d5a7f4212443b7452851c9ccd88df9c6d4d58fa2cea2ead4dd9cb328e578"},
- {file = "psycopg_binary-3.1.9-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:07414daa86662f7657e9fabe49af85a32a975e92e6568337887d9c9ffedc224f"},
- {file = "psycopg_binary-3.1.9-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:17c5d4936c746f5125c6ef9eb43655e27d4d0c9ffe34c3073878b43c3192511d"},
- {file = "psycopg_binary-3.1.9-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:5cdc13c8ec1437240801e43d07e27ff6479ac9dd8583ecf647345bfd2e8390e4"},
- {file = "psycopg_binary-3.1.9-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:3836bdaf030a5648bd5f5b452e4b068b265e28f9199060c5b70dbf4a218cde6e"},
- {file = "psycopg_binary-3.1.9-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:96725d9691a84a21eb3e81c884a2e043054e33e176801a57a05e9ac38d142c6e"},
- {file = "psycopg_binary-3.1.9-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:dade344aa90bb0b57d1cfc13304ed83ab9a36614b8ddd671381b2de72fe1483d"},
- {file = "psycopg_binary-3.1.9-cp37-cp37m-win_amd64.whl", hash = "sha256:db866cc557d9761036771d666d17fa4176c537af7e6098f42a6bf8f64217935f"},
- {file = "psycopg_binary-3.1.9-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3b62545cc64dd69ea0ae5ffe18d7c97e03660ab8244aa8c5172668a21c41daa0"},
- {file = "psycopg_binary-3.1.9-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:058ab0d79be0b229338f0e61fec6f475077518cba63c22c593645a69f01c3e23"},
- {file = "psycopg_binary-3.1.9-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2340ca2531f69e5ebd9d18987362ba57ed6ab6a271511d8026814a46a2a87b59"},
- {file = "psycopg_binary-3.1.9-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3b816ce0e27a2a8786d34b61d3e36e01029245025879d64b88554326b794a4f0"},
- {file = "psycopg_binary-3.1.9-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7b36fe4314a784fbe45c9fd71c902b9bf57341aff9b97c0cbd22f8409a271e2f"},
- {file = "psycopg_binary-3.1.9-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b246fed629482b06f938b23e9281c4af592329daa3ec2cd4a6841ccbfdeb4d68"},
- {file = "psycopg_binary-3.1.9-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:90787ac05b932c0fc678cbf470ccea9c385b8077583f0490136b4569ed3fb652"},
- {file = "psycopg_binary-3.1.9-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:9c114f678e8f4a96530fa79cfd84f65f26358ecfc6cca70cfa2d5e3ae5ef217a"},
- {file = "psycopg_binary-3.1.9-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:3a82e77400d1ef6c5bbcf3e600e8bdfacf1a554512f96c090c43ceca3d1ce3b6"},
- {file = "psycopg_binary-3.1.9-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:c7d990f14a37345ca05a5192cd5ac938c9cbedca9c929872af6ae311158feb0e"},
- {file = "psycopg_binary-3.1.9-cp38-cp38-win_amd64.whl", hash = "sha256:e0ca74fd85718723bb9f08e0c6898e901a0c365aef20b3c3a4ef8709125d6210"},
- {file = "psycopg_binary-3.1.9-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:ce8f4dea5934aa6c4933e559c74bef4beb3413f51fbcf17f306ce890216ac33a"},
- {file = "psycopg_binary-3.1.9-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f41a9e0de4db194c053bcc7c00c35422a4d19d92a8187e8065b1c560626efe35"},
- {file = "psycopg_binary-3.1.9-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f94a7985135e084e122b143956c6f589d17aef743ecd0a434a3d3a222631d5a"},
- {file = "psycopg_binary-3.1.9-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3bb86d58b90faefdc0bbedf08fdea4cc2afcb1cfa4340f027d458bfd01d8b812"},
- {file = "psycopg_binary-3.1.9-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6c696dc84f9ff155761df15779181d8e4af7746b98908e130add8259912e4bb7"},
- {file = "psycopg_binary-3.1.9-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4213953da44324850c8f789301cf665f46fb94301ba403301e7af58546c3a428"},
- {file = "psycopg_binary-3.1.9-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:25e3ce947aaaa1bd9f1920fca76d7281660646304f9ea5bc036b201dd8790655"},
- {file = "psycopg_binary-3.1.9-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9c75be2a9b986139e3ff6bc0a2852081ac00811040f9b82d3aa539821311122e"},
- {file = "psycopg_binary-3.1.9-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:63e8d1dbe253657c70dbfa9c59423f4654d82698fc5ed6868b8dc0765abe20b6"},
- {file = "psycopg_binary-3.1.9-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:f4da4ca9b2365fc1d3fc741c3bbd3efccd892ce813444b884c8911a1acf1c932"},
- {file = "psycopg_binary-3.1.9-cp39-cp39-win_amd64.whl", hash = "sha256:c0b8d6bbeff1dba760a208d8bc205a05b745e6cee02b839f969f72cf56a8b80d"},
+ {file = "psycopg_binary-3.1.10-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:a529c203f6e0f4c67ba27cf8f9739eb3bc880ad70d6ad6c0e56c2230a66b5a09"},
+ {file = "psycopg_binary-3.1.10-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bd6e14d1aeb12754a43446c77a5ce819b68875cc25ae6538089ef90d7f6dd6f7"},
+ {file = "psycopg_binary-3.1.10-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1583ced5948cf88124212c4503dfe5b01ac3e2dd1a2833c083917f4c4aabe8b4"},
+ {file = "psycopg_binary-3.1.10-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2098721c486478987be700723b28ec7a48f134eba339de36af0e745f37dfe461"},
+ {file = "psycopg_binary-3.1.10-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7e61f7b412fca7b15dd043a0b22fd528d2ed8276e76b3764c3889e29fa65082b"},
+ {file = "psycopg_binary-3.1.10-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e0f33e33a072e3d5af51ee4d4a439e10dbe623fe87ef295d5d688180d529f13f"},
+ {file = "psycopg_binary-3.1.10-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:f6f7738c59262d8d19154164d99c881ed58ed377fb6f1d685eb0dc43bbcd8022"},
+ {file = "psycopg_binary-3.1.10-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:511d38b1e1961d179d47d5103ba9634ecfc7ead431d19a9337ef82f3a2bca807"},
+ {file = "psycopg_binary-3.1.10-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:666e7acf2ffdb5e8a58e8b0c1759facdb9688c7e90ee8ca7aed675803b57404d"},
+ {file = "psycopg_binary-3.1.10-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:57b93c756fee5f7c7bd580c34cd5d244f7d5638f8b2cf25333f97b9b8b2ebfd1"},
+ {file = "psycopg_binary-3.1.10-cp310-cp310-win_amd64.whl", hash = "sha256:a1d61b7724c7215a8ea4495a5c6b704656f4b7bb6165f4cb9989b685886ebc48"},
+ {file = "psycopg_binary-3.1.10-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:36fff836a7823c9d71fa7faa333c74b2b081af216cebdbb0f481dce55ee2d974"},
+ {file = "psycopg_binary-3.1.10-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:32caf98cb00881bfcbbbae39a15f2a4e08b79ff983f1c0f13b60a888ef6e8431"},
+ {file = "psycopg_binary-3.1.10-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5565a6a86fee8d74f30de89e07f399567cdf59367aeb09624eb690d524339076"},
+ {file = "psycopg_binary-3.1.10-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9fb0d64520b29bd80a6731476ad8e1c20348dfdee00ab098899d23247b641675"},
+ {file = "psycopg_binary-3.1.10-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bfc05ed4e74fa8615d7cc2bd57f00f97662f4e865a731dbd43da9a527e289c8c"},
+ {file = "psycopg_binary-3.1.10-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5b59c8cff887757ddf438ff9489d79c5e6b717112c96f5c68e16f367ff8724e"},
+ {file = "psycopg_binary-3.1.10-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a4cbaf12361136afefc5faab21a174a437e71c803b083f410e5140c7605bc66b"},
+ {file = "psycopg_binary-3.1.10-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:ff72576061c774bcce5f5440b93e63d4c430032dd056d30f6cb1988e549dd92c"},
+ {file = "psycopg_binary-3.1.10-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:a4e91e1a8d61c60f592a1dfcebdf55e52a29fe4fdb650c5bd5414c848e77d029"},
+ {file = "psycopg_binary-3.1.10-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:f7187269d825e84c945be7d93dd5088a4e0b6481a4bdaba3bf7069d4ac13703d"},
+ {file = "psycopg_binary-3.1.10-cp311-cp311-win_amd64.whl", hash = "sha256:ba7812a593c16d9d661844dc8dd4d81548fd1c2a0ee676f3e3d8638369f4c5e4"},
+ {file = "psycopg_binary-3.1.10-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:88caa5859740507b3596c6c2e00ceaccee2c6ab5317bc535887801ad3cc7f3e1"},
+ {file = "psycopg_binary-3.1.10-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4a3a7e99ba10c2e83a48d79431560e0d5ca7865f68f2bac3a462dc2b151e9926"},
+ {file = "psycopg_binary-3.1.10-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:848f4f4707dc73f4b4e844c92f3de795b2ddb728f75132602bda5e6ba55084fc"},
+ {file = "psycopg_binary-3.1.10-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:415961e839bb49cfd75cd961503fb8846c0768f247db1fa7171c1ac61d38711b"},
+ {file = "psycopg_binary-3.1.10-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0471869e658d0c6b8c3ed53153794739c18d7dad2dd5b8e6ff023a364c20f7df"},
+ {file = "psycopg_binary-3.1.10-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:4290060ee0d856caa979ecf675c0e6959325f508272ccf27f64c3801c7bcbde7"},
+ {file = "psycopg_binary-3.1.10-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:abf04bc06c8f6a1ac3dc2106d3b79c8661352e9d8a57ca2934ffa6aae8fe600a"},
+ {file = "psycopg_binary-3.1.10-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:51fe70708243b83bf16710d8c11b61bd46562e6a24a6300d5434380b35911059"},
+ {file = "psycopg_binary-3.1.10-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8b658f7f8b49fb60a1c52e3f6692f690a85bdf1ad30aafe0f3f1fd74f6958cf8"},
+ {file = "psycopg_binary-3.1.10-cp37-cp37m-win_amd64.whl", hash = "sha256:ffc8c796194f23b9b07f6d25f927ec4df84a194bbc7a1f9e73316734eef512f9"},
+ {file = "psycopg_binary-3.1.10-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:74ce92122be34cf0e5f06d79869e1001c8421a68fa7ddf6fe38a717155cf3a64"},
+ {file = "psycopg_binary-3.1.10-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:75608a900984061c8898be68fbddc6f3da5eefdffce6e0624f5371645740d172"},
+ {file = "psycopg_binary-3.1.10-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6670d160d054466e8fdedfbc749ef8bf7dfdf69296048954d24645dd4d3d3c01"},
+ {file = "psycopg_binary-3.1.10-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d32026cfab7ba7ac687a42c33345026a2fb6fc5608a6144077f767af4386be0b"},
+ {file = "psycopg_binary-3.1.10-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:908fa388a5b75dfd17a937acb24708bd272e21edefca9a495004c6f70ec2636a"},
+ {file = "psycopg_binary-3.1.10-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1e46b97073bd4de114f475249d681eaf054e950699c5d7af554d3684db39b82d"},
+ {file = "psycopg_binary-3.1.10-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:9cf56bb4b115def3a18157f3b3b7d8322ee94a8dea30028db602c8f9ae34ad1e"},
+ {file = "psycopg_binary-3.1.10-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:3b6c6f90241c4c5a6ca3f0d8827e37ef90fdc4deb9d8cfa5678baa0ea374b391"},
+ {file = "psycopg_binary-3.1.10-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:747176a6aeb058079f56c5397bd90339581ab7b3cc0d62e7445654e6a484c7e1"},
+ {file = "psycopg_binary-3.1.10-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:41a415e78c457b06497fa0084e4ea7245ca1a377b55756dd757034210b64da7e"},
+ {file = "psycopg_binary-3.1.10-cp38-cp38-win_amd64.whl", hash = "sha256:a7bbe9017edd898d7b3a8747700ed045dda96a907dff87f45e642e28d8584481"},
+ {file = "psycopg_binary-3.1.10-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:0f062f20256708929a58c41d44f350efced4c00a603323d1413f6dc0b84d95a5"},
+ {file = "psycopg_binary-3.1.10-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:dea30f2704337ca2d0322fccfe1fa30f61ce9185de3937eb986321063114a51f"},
+ {file = "psycopg_binary-3.1.10-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b9d88ac72531034ebf7ec09114e732b066a9078f4ce213cf65cc5e42eb538d30"},
+ {file = "psycopg_binary-3.1.10-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f2bea0940d69c3e24a72530730952687912893b34c53aa39e79045e7b446174d"},
+ {file = "psycopg_binary-3.1.10-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6a691dc8e2436d9c1e5cf93902d63e9501688fccc957eb22f952d37886257470"},
+ {file = "psycopg_binary-3.1.10-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa92661f99351765673835a4d936d79bd24dfbb358b29b084d83be38229a90e4"},
+ {file = "psycopg_binary-3.1.10-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:30eb731ed5525d8df892db6532cc8ffd8a163b73bc355127dee9c49334e16eee"},
+ {file = "psycopg_binary-3.1.10-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:50bf7a59d3a85a82d466fed341d352b44d09d6adc18656101d163a7cfc6509a0"},
+ {file = "psycopg_binary-3.1.10-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:f48665947c55f8d6eb3f0be98de80411508e1ec329f354685329b57fced82c7f"},
+ {file = "psycopg_binary-3.1.10-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:caa771569da01fc0389ca34920c331a284425a68f92d1ba0a80cc08935f8356e"},
+ {file = "psycopg_binary-3.1.10-cp39-cp39-win_amd64.whl", hash = "sha256:b30887e631fd67affaed98f6cd2135b44f2d1a6d9bca353a69c3889c78bd7aa8"},
]
[[package]]
name = "psycopg2-binary"
-version = "2.9.6"
+version = "2.9.7"
description = "psycopg2 - Python-PostgreSQL Database Adapter"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
- {file = "psycopg2-binary-2.9.6.tar.gz", hash = "sha256:1f64dcfb8f6e0c014c7f55e51c9759f024f70ea572fbdef123f85318c297947c"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d26e0342183c762de3276cca7a530d574d4e25121ca7d6e4a98e4f05cb8e4df7"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c48d8f2db17f27d41fb0e2ecd703ea41984ee19362cbce52c097963b3a1b4365"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffe9dc0a884a8848075e576c1de0290d85a533a9f6e9c4e564f19adf8f6e54a7"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8a76e027f87753f9bd1ab5f7c9cb8c7628d1077ef927f5e2446477153a602f2c"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6460c7a99fc939b849431f1e73e013d54aa54293f30f1109019c56a0b2b2ec2f"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ae102a98c547ee2288637af07393dd33f440c25e5cd79556b04e3fca13325e5f"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:9972aad21f965599ed0106f65334230ce826e5ae69fda7cbd688d24fa922415e"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:7a40c00dbe17c0af5bdd55aafd6ff6679f94a9be9513a4c7e071baf3d7d22a70"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:cacbdc5839bdff804dfebc058fe25684cae322987f7a38b0168bc1b2df703fb1"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7f0438fa20fb6c7e202863e0d5ab02c246d35efb1d164e052f2f3bfe2b152bd0"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-win32.whl", hash = "sha256:b6c8288bb8a84b47e07013bb4850f50538aa913d487579e1921724631d02ea1b"},
- {file = "psycopg2_binary-2.9.6-cp310-cp310-win_amd64.whl", hash = "sha256:61b047a0537bbc3afae10f134dc6393823882eb263088c271331602b672e52e9"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:964b4dfb7c1c1965ac4c1978b0f755cc4bd698e8aa2b7667c575fb5f04ebe06b"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:afe64e9b8ea66866a771996f6ff14447e8082ea26e675a295ad3bdbffdd72afb"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15e2ee79e7cf29582ef770de7dab3d286431b01c3bb598f8e05e09601b890081"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:dfa74c903a3c1f0d9b1c7e7b53ed2d929a4910e272add6700c38f365a6002820"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b83456c2d4979e08ff56180a76429263ea254c3f6552cd14ada95cff1dec9bb8"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0645376d399bfd64da57148694d78e1f431b1e1ee1054872a5713125681cf1be"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e99e34c82309dd78959ba3c1590975b5d3c862d6f279f843d47d26ff89d7d7e1"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:4ea29fc3ad9d91162c52b578f211ff1c931d8a38e1f58e684c45aa470adf19e2"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:4ac30da8b4f57187dbf449294d23b808f8f53cad6b1fc3623fa8a6c11d176dd0"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e78e6e2a00c223e164c417628572a90093c031ed724492c763721c2e0bc2a8df"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-win32.whl", hash = "sha256:1876843d8e31c89c399e31b97d4b9725a3575bb9c2af92038464231ec40f9edb"},
- {file = "psycopg2_binary-2.9.6-cp311-cp311-win_amd64.whl", hash = "sha256:b4b24f75d16a89cc6b4cdff0eb6a910a966ecd476d1e73f7ce5985ff1328e9a6"},
- {file = "psycopg2_binary-2.9.6-cp36-cp36m-win32.whl", hash = "sha256:498807b927ca2510baea1b05cc91d7da4718a0f53cb766c154c417a39f1820a0"},
- {file = "psycopg2_binary-2.9.6-cp36-cp36m-win_amd64.whl", hash = "sha256:0d236c2825fa656a2d98bbb0e52370a2e852e5a0ec45fc4f402977313329174d"},
- {file = "psycopg2_binary-2.9.6-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:34b9ccdf210cbbb1303c7c4db2905fa0319391bd5904d32689e6dd5c963d2ea8"},
- {file = "psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:84d2222e61f313c4848ff05353653bf5f5cf6ce34df540e4274516880d9c3763"},
- {file = "psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:30637a20623e2a2eacc420059be11527f4458ef54352d870b8181a4c3020ae6b"},
- {file = "psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8122cfc7cae0da9a3077216528b8bb3629c43b25053284cc868744bfe71eb141"},
- {file = "psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:38601cbbfe600362c43714482f43b7c110b20cb0f8172422c616b09b85a750c5"},
- {file = "psycopg2_binary-2.9.6-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c7e62ab8b332147a7593a385d4f368874d5fe4ad4e341770d4983442d89603e3"},
- {file = "psycopg2_binary-2.9.6-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:2ab652e729ff4ad76d400df2624d223d6e265ef81bb8aa17fbd63607878ecbee"},
- {file = "psycopg2_binary-2.9.6-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:c83a74b68270028dc8ee74d38ecfaf9c90eed23c8959fca95bd703d25b82c88e"},
- {file = "psycopg2_binary-2.9.6-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d4e6036decf4b72d6425d5b29bbd3e8f0ff1059cda7ac7b96d6ac5ed34ffbacd"},
- {file = "psycopg2_binary-2.9.6-cp37-cp37m-win32.whl", hash = "sha256:a8c28fd40a4226b4a84bdf2d2b5b37d2c7bd49486b5adcc200e8c7ec991dfa7e"},
- {file = "psycopg2_binary-2.9.6-cp37-cp37m-win_amd64.whl", hash = "sha256:51537e3d299be0db9137b321dfb6a5022caaab275775680e0c3d281feefaca6b"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:cf4499e0a83b7b7edcb8dabecbd8501d0d3a5ef66457200f77bde3d210d5debb"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7e13a5a2c01151f1208d5207e42f33ba86d561b7a89fca67c700b9486a06d0e2"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e0f754d27fddcfd74006455b6e04e6705d6c31a612ec69ddc040a5468e44b4e"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d57c3fd55d9058645d26ae37d76e61156a27722097229d32a9e73ed54819982a"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:71f14375d6f73b62800530b581aed3ada394039877818b2d5f7fc77e3bb6894d"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:441cc2f8869a4f0f4bb408475e5ae0ee1f3b55b33f350406150277f7f35384fc"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:65bee1e49fa6f9cf327ce0e01c4c10f39165ee76d35c846ade7cb0ec6683e303"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:af335bac6b666cc6aea16f11d486c3b794029d9df029967f9938a4bed59b6a19"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:cfec476887aa231b8548ece2e06d28edc87c1397ebd83922299af2e051cf2827"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:65c07febd1936d63bfde78948b76cd4c2a411572a44ac50719ead41947d0f26b"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-win32.whl", hash = "sha256:4dfb4be774c4436a4526d0c554af0cc2e02082c38303852a36f6456ece7b3503"},
- {file = "psycopg2_binary-2.9.6-cp38-cp38-win_amd64.whl", hash = "sha256:02c6e3cf3439e213e4ee930308dc122d6fb4d4bea9aef4a12535fbd605d1a2fe"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:e9182eb20f41417ea1dd8e8f7888c4d7c6e805f8a7c98c1081778a3da2bee3e4"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8a6979cf527e2603d349a91060f428bcb135aea2be3201dff794813256c274f1"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8338a271cb71d8da40b023a35d9c1e919eba6cbd8fa20a54b748a332c355d896"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e3ed340d2b858d6e6fb5083f87c09996506af483227735de6964a6100b4e6a54"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f81e65376e52f03422e1fb475c9514185669943798ed019ac50410fb4c4df232"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfb13af3c5dd3a9588000910178de17010ebcccd37b4f9794b00595e3a8ddad3"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:4c727b597c6444a16e9119386b59388f8a424223302d0c06c676ec8b4bc1f963"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:4d67fbdaf177da06374473ef6f7ed8cc0a9dc640b01abfe9e8a2ccb1b1402c1f"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:0892ef645c2fabb0c75ec32d79f4252542d0caec1d5d949630e7d242ca4681a3"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:02c0f3757a4300cf379eb49f543fb7ac527fb00144d39246ee40e1df684ab514"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-win32.whl", hash = "sha256:c3dba7dab16709a33a847e5cd756767271697041fbe3fe97c215b1fc1f5c9848"},
- {file = "psycopg2_binary-2.9.6-cp39-cp39-win_amd64.whl", hash = "sha256:f6a88f384335bb27812293fdb11ac6aee2ca3f51d3c7820fe03de0a304ab6249"},
+ {file = "psycopg2-binary-2.9.7.tar.gz", hash = "sha256:1b918f64a51ffe19cd2e230b3240ba481330ce1d4b7875ae67305bd1d37b041c"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ea5f8ee87f1eddc818fc04649d952c526db4426d26bab16efbe5a0c52b27d6ab"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:2993ccb2b7e80844d534e55e0f12534c2871952f78e0da33c35e648bf002bbff"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dbbc3c5d15ed76b0d9db7753c0db40899136ecfe97d50cbde918f630c5eb857a"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:692df8763b71d42eb8343f54091368f6f6c9cfc56dc391858cdb3c3ef1e3e584"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9dcfd5d37e027ec393a303cc0a216be564b96c80ba532f3d1e0d2b5e5e4b1e6e"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:17cc17a70dfb295a240db7f65b6d8153c3d81efb145d76da1e4a096e9c5c0e63"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e5666632ba2b0d9757b38fc17337d84bdf932d38563c5234f5f8c54fd01349c9"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:7db7b9b701974c96a88997d458b38ccb110eba8f805d4b4f74944aac48639b42"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:c82986635a16fb1fa15cd5436035c88bc65c3d5ced1cfaac7f357ee9e9deddd4"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:4fe13712357d802080cfccbf8c6266a3121dc0e27e2144819029095ccf708372"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-win32.whl", hash = "sha256:122641b7fab18ef76b18860dd0c772290566b6fb30cc08e923ad73d17461dc63"},
+ {file = "psycopg2_binary-2.9.7-cp310-cp310-win_amd64.whl", hash = "sha256:f8651cf1f144f9ee0fa7d1a1df61a9184ab72962531ca99f077bbdcba3947c58"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4ecc15666f16f97709106d87284c136cdc82647e1c3f8392a672616aed3c7151"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3fbb1184c7e9d28d67671992970718c05af5f77fc88e26fd7136613c4ece1f89"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8a7968fd20bd550431837656872c19575b687f3f6f98120046228e451e4064df"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:094af2e77a1976efd4956a031028774b827029729725e136514aae3cdf49b87b"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:26484e913d472ecb6b45937ea55ce29c57c662066d222fb0fbdc1fab457f18c5"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f309b77a7c716e6ed9891b9b42953c3ff7d533dc548c1e33fddc73d2f5e21f9"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:6d92e139ca388ccfe8c04aacc163756e55ba4c623c6ba13d5d1595ed97523e4b"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:2df562bb2e4e00ee064779902d721223cfa9f8f58e7e52318c97d139cf7f012d"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:4eec5d36dbcfc076caab61a2114c12094c0b7027d57e9e4387b634e8ab36fd44"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1011eeb0c51e5b9ea1016f0f45fa23aca63966a4c0afcf0340ccabe85a9f65bd"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-win32.whl", hash = "sha256:ded8e15f7550db9e75c60b3d9fcbc7737fea258a0f10032cdb7edc26c2a671fd"},
+ {file = "psycopg2_binary-2.9.7-cp311-cp311-win_amd64.whl", hash = "sha256:8a136c8aaf6615653450817a7abe0fc01e4ea720ae41dfb2823eccae4b9062a3"},
+ {file = "psycopg2_binary-2.9.7-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:2dec5a75a3a5d42b120e88e6ed3e3b37b46459202bb8e36cd67591b6e5feebc1"},
+ {file = "psycopg2_binary-2.9.7-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fc10da7e7df3380426521e8c1ed975d22df678639da2ed0ec3244c3dc2ab54c8"},
+ {file = "psycopg2_binary-2.9.7-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ee919b676da28f78f91b464fb3e12238bd7474483352a59c8a16c39dfc59f0c5"},
+ {file = "psycopg2_binary-2.9.7-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:eb1c0e682138f9067a58fc3c9a9bf1c83d8e08cfbee380d858e63196466d5c86"},
+ {file = "psycopg2_binary-2.9.7-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:00d8db270afb76f48a499f7bb8fa70297e66da67288471ca873db88382850bf4"},
+ {file = "psycopg2_binary-2.9.7-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:9b0c2b466b2f4d89ccc33784c4ebb1627989bd84a39b79092e560e937a11d4ac"},
+ {file = "psycopg2_binary-2.9.7-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:51d1b42d44f4ffb93188f9b39e6d1c82aa758fdb8d9de65e1ddfe7a7d250d7ad"},
+ {file = "psycopg2_binary-2.9.7-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:11abdbfc6f7f7dea4a524b5f4117369b0d757725798f1593796be6ece20266cb"},
+ {file = "psycopg2_binary-2.9.7-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:f02f4a72cc3ab2565c6d9720f0343cb840fb2dc01a2e9ecb8bc58ccf95dc5c06"},
+ {file = "psycopg2_binary-2.9.7-cp37-cp37m-win32.whl", hash = "sha256:81d5dd2dd9ab78d31a451e357315f201d976c131ca7d43870a0e8063b6b7a1ec"},
+ {file = "psycopg2_binary-2.9.7-cp37-cp37m-win_amd64.whl", hash = "sha256:62cb6de84d7767164a87ca97e22e5e0a134856ebcb08f21b621c6125baf61f16"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:59f7e9109a59dfa31efa022e94a244736ae401526682de504e87bd11ce870c22"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:95a7a747bdc3b010bb6a980f053233e7610276d55f3ca506afff4ad7749ab58a"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c721ee464e45ecf609ff8c0a555018764974114f671815a0a7152aedb9f3343"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f4f37bbc6588d402980ffbd1f3338c871368fb4b1cfa091debe13c68bb3852b3"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ac83ab05e25354dad798401babaa6daa9577462136ba215694865394840e31f8"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:024eaeb2a08c9a65cd5f94b31ace1ee3bb3f978cd4d079406aef85169ba01f08"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:1c31c2606ac500dbd26381145684d87730a2fac9a62ebcfbaa2b119f8d6c19f4"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:42a62ef0e5abb55bf6ffb050eb2b0fcd767261fa3faf943a4267539168807522"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:7952807f95c8eba6a8ccb14e00bf170bb700cafcec3924d565235dffc7dc4ae8"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:e02bc4f2966475a7393bd0f098e1165d470d3fa816264054359ed4f10f6914ea"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-win32.whl", hash = "sha256:fdca0511458d26cf39b827a663d7d87db6f32b93efc22442a742035728603d5f"},
+ {file = "psycopg2_binary-2.9.7-cp38-cp38-win_amd64.whl", hash = "sha256:d0b16e5bb0ab78583f0ed7ab16378a0f8a89a27256bb5560402749dbe8a164d7"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6822c9c63308d650db201ba22fe6648bd6786ca6d14fdaf273b17e15608d0852"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8f94cb12150d57ea433e3e02aabd072205648e86f1d5a0a692d60242f7809b15"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a5ee89587696d808c9a00876065d725d4ae606f5f7853b961cdbc348b0f7c9a1"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ad5ec10b53cbb57e9a2e77b67e4e4368df56b54d6b00cc86398578f1c635f329"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:642df77484b2dcaf87d4237792246d8068653f9e0f5c025e2c692fc56b0dda70"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a6a8b575ac45af1eaccbbcdcf710ab984fd50af048fe130672377f78aaff6fc1"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:f955aa50d7d5220fcb6e38f69ea126eafecd812d96aeed5d5f3597f33fad43bb"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:ad26d4eeaa0d722b25814cce97335ecf1b707630258f14ac4d2ed3d1d8415265"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:ced63c054bdaf0298f62681d5dcae3afe60cbae332390bfb1acf0e23dcd25fc8"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:2b04da24cbde33292ad34a40db9832a80ad12de26486ffeda883413c9e1b1d5e"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-win32.whl", hash = "sha256:18f12632ab516c47c1ac4841a78fddea6508a8284c7cf0f292cb1a523f2e2379"},
+ {file = "psycopg2_binary-2.9.7-cp39-cp39-win_amd64.whl", hash = "sha256:eb3b8d55924a6058a26db69fb1d3e7e32695ff8b491835ba9f479537e14dcf9f"},
]
[[package]]
name = "ptyprocess"
version = "0.7.0"
description = "Run a subprocess in a pseudo terminal"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -4876,7 +4801,6 @@ files = [
name = "pulsar-client"
version = "3.2.0"
description = "Apache Pulsar Python client library"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -4924,7 +4848,6 @@ functions = ["apache-bookkeeper-client (>=4.16.1)", "grpcio (>=1.8.2)", "prometh
name = "pure-eval"
version = "0.2.2"
description = "Safely evaluate AST nodes without side effects"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -4935,11 +4858,21 @@ files = [
[package.extras]
tests = ["pytest"]
+[[package]]
+name = "py-cpuinfo"
+version = "9.0.0"
+description = "Get CPU info with pure Python"
+optional = true
+python-versions = "*"
+files = [
+ {file = "py-cpuinfo-9.0.0.tar.gz", hash = "sha256:3cdbbf3fac90dc6f118bfd64384f309edeadd902d7c8fb17f02ffa1fc3f49690"},
+ {file = "py_cpuinfo-9.0.0-py3-none-any.whl", hash = "sha256:859625bc251f64e21f077d099d4162689c762b5d6a4c3c97553d56241c9674d5"},
+]
+
[[package]]
name = "pyarrow"
version = "12.0.1"
description = "Python library for Apache Arrow"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -4977,7 +4910,6 @@ numpy = ">=1.16.6"
name = "pyasn1"
version = "0.5.0"
description = "Pure-Python implementation of ASN.1 types and DER/BER/CER codecs (X.208)"
-category = "main"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
files = [
@@ -4989,7 +4921,6 @@ files = [
name = "pyasn1-modules"
version = "0.3.0"
description = "A collection of ASN.1-based protocols modules"
-category = "main"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
files = [
@@ -5004,7 +4935,6 @@ pyasn1 = ">=0.4.6,<0.6.0"
name = "pycparser"
version = "2.21"
description = "C parser in Python"
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
@@ -5014,48 +4944,47 @@ files = [
[[package]]
name = "pydantic"
-version = "1.10.11"
+version = "1.10.12"
description = "Data validation and settings management using python type hints"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "pydantic-1.10.11-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ff44c5e89315b15ff1f7fdaf9853770b810936d6b01a7bcecaa227d2f8fe444f"},
- {file = "pydantic-1.10.11-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a6c098d4ab5e2d5b3984d3cb2527e2d6099d3de85630c8934efcfdc348a9760e"},
- {file = "pydantic-1.10.11-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:16928fdc9cb273c6af00d9d5045434c39afba5f42325fb990add2c241402d151"},
- {file = "pydantic-1.10.11-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0588788a9a85f3e5e9ebca14211a496409cb3deca5b6971ff37c556d581854e7"},
- {file = "pydantic-1.10.11-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e9baf78b31da2dc3d3f346ef18e58ec5f12f5aaa17ac517e2ffd026a92a87588"},
- {file = "pydantic-1.10.11-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:373c0840f5c2b5b1ccadd9286782852b901055998136287828731868027a724f"},
- {file = "pydantic-1.10.11-cp310-cp310-win_amd64.whl", hash = "sha256:c3339a46bbe6013ef7bdd2844679bfe500347ac5742cd4019a88312aa58a9847"},
- {file = "pydantic-1.10.11-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:08a6c32e1c3809fbc49debb96bf833164f3438b3696abf0fbeceb417d123e6eb"},
- {file = "pydantic-1.10.11-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a451ccab49971af043ec4e0d207cbc8cbe53dbf148ef9f19599024076fe9c25b"},
- {file = "pydantic-1.10.11-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5b02d24f7b2b365fed586ed73582c20f353a4c50e4be9ba2c57ab96f8091ddae"},
- {file = "pydantic-1.10.11-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3f34739a89260dfa420aa3cbd069fbcc794b25bbe5c0a214f8fb29e363484b66"},
- {file = "pydantic-1.10.11-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:e297897eb4bebde985f72a46a7552a7556a3dd11e7f76acda0c1093e3dbcf216"},
- {file = "pydantic-1.10.11-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d185819a7a059550ecb85d5134e7d40f2565f3dd94cfd870132c5f91a89cf58c"},
- {file = "pydantic-1.10.11-cp311-cp311-win_amd64.whl", hash = "sha256:4400015f15c9b464c9db2d5d951b6a780102cfa5870f2c036d37c23b56f7fc1b"},
- {file = "pydantic-1.10.11-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:2417de68290434461a266271fc57274a138510dca19982336639484c73a07af6"},
- {file = "pydantic-1.10.11-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:331c031ba1554b974c98679bd0780d89670d6fd6f53f5d70b10bdc9addee1713"},
- {file = "pydantic-1.10.11-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8268a735a14c308923e8958363e3a3404f6834bb98c11f5ab43251a4e410170c"},
- {file = "pydantic-1.10.11-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:44e51ba599c3ef227e168424e220cd3e544288c57829520dc90ea9cb190c3248"},
- {file = "pydantic-1.10.11-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d7781f1d13b19700b7949c5a639c764a077cbbdd4322ed505b449d3ca8edcb36"},
- {file = "pydantic-1.10.11-cp37-cp37m-win_amd64.whl", hash = "sha256:7522a7666157aa22b812ce14c827574ddccc94f361237ca6ea8bb0d5c38f1629"},
- {file = "pydantic-1.10.11-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:bc64eab9b19cd794a380179ac0e6752335e9555d214cfcb755820333c0784cb3"},
- {file = "pydantic-1.10.11-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:8dc77064471780262b6a68fe67e013298d130414d5aaf9b562c33987dbd2cf4f"},
- {file = "pydantic-1.10.11-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fe429898f2c9dd209bd0632a606bddc06f8bce081bbd03d1c775a45886e2c1cb"},
- {file = "pydantic-1.10.11-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:192c608ad002a748e4a0bed2ddbcd98f9b56df50a7c24d9a931a8c5dd053bd3d"},
- {file = "pydantic-1.10.11-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:ef55392ec4bb5721f4ded1096241e4b7151ba6d50a50a80a2526c854f42e6a2f"},
- {file = "pydantic-1.10.11-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:41e0bb6efe86281623abbeeb0be64eab740c865388ee934cd3e6a358784aca6e"},
- {file = "pydantic-1.10.11-cp38-cp38-win_amd64.whl", hash = "sha256:265a60da42f9f27e0b1014eab8acd3e53bd0bad5c5b4884e98a55f8f596b2c19"},
- {file = "pydantic-1.10.11-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:469adf96c8e2c2bbfa655fc7735a2a82f4c543d9fee97bd113a7fb509bf5e622"},
- {file = "pydantic-1.10.11-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e6cbfbd010b14c8a905a7b10f9fe090068d1744d46f9e0c021db28daeb8b6de1"},
- {file = "pydantic-1.10.11-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:abade85268cc92dff86d6effcd917893130f0ff516f3d637f50dadc22ae93999"},
- {file = "pydantic-1.10.11-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e9738b0f2e6c70f44ee0de53f2089d6002b10c33264abee07bdb5c7f03038303"},
- {file = "pydantic-1.10.11-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:787cf23e5a0cde753f2eabac1b2e73ae3844eb873fd1f5bdbff3048d8dbb7604"},
- {file = "pydantic-1.10.11-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:174899023337b9fc685ac8adaa7b047050616136ccd30e9070627c1aaab53a13"},
- {file = "pydantic-1.10.11-cp39-cp39-win_amd64.whl", hash = "sha256:1954f8778489a04b245a1e7b8b22a9d3ea8ef49337285693cf6959e4b757535e"},
- {file = "pydantic-1.10.11-py3-none-any.whl", hash = "sha256:008c5e266c8aada206d0627a011504e14268a62091450210eda7c07fabe6963e"},
- {file = "pydantic-1.10.11.tar.gz", hash = "sha256:f66d479cf7eb331372c470614be6511eae96f1f120344c25f3f9bb59fb1b5528"},
+ {file = "pydantic-1.10.12-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:a1fcb59f2f355ec350073af41d927bf83a63b50e640f4dbaa01053a28b7a7718"},
+ {file = "pydantic-1.10.12-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:b7ccf02d7eb340b216ec33e53a3a629856afe1c6e0ef91d84a4e6f2fb2ca70fe"},
+ {file = "pydantic-1.10.12-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8fb2aa3ab3728d950bcc885a2e9eff6c8fc40bc0b7bb434e555c215491bcf48b"},
+ {file = "pydantic-1.10.12-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:771735dc43cf8383959dc9b90aa281f0b6092321ca98677c5fb6125a6f56d58d"},
+ {file = "pydantic-1.10.12-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:ca48477862372ac3770969b9d75f1bf66131d386dba79506c46d75e6b48c1e09"},
+ {file = "pydantic-1.10.12-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:a5e7add47a5b5a40c49b3036d464e3c7802f8ae0d1e66035ea16aa5b7a3923ed"},
+ {file = "pydantic-1.10.12-cp310-cp310-win_amd64.whl", hash = "sha256:e4129b528c6baa99a429f97ce733fff478ec955513630e61b49804b6cf9b224a"},
+ {file = "pydantic-1.10.12-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:b0d191db0f92dfcb1dec210ca244fdae5cbe918c6050b342d619c09d31eea0cc"},
+ {file = "pydantic-1.10.12-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:795e34e6cc065f8f498c89b894a3c6da294a936ee71e644e4bd44de048af1405"},
+ {file = "pydantic-1.10.12-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:69328e15cfda2c392da4e713443c7dbffa1505bc9d566e71e55abe14c97ddc62"},
+ {file = "pydantic-1.10.12-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2031de0967c279df0d8a1c72b4ffc411ecd06bac607a212892757db7462fc494"},
+ {file = "pydantic-1.10.12-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:ba5b2e6fe6ca2b7e013398bc7d7b170e21cce322d266ffcd57cca313e54fb246"},
+ {file = "pydantic-1.10.12-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:2a7bac939fa326db1ab741c9d7f44c565a1d1e80908b3797f7f81a4f86bc8d33"},
+ {file = "pydantic-1.10.12-cp311-cp311-win_amd64.whl", hash = "sha256:87afda5539d5140cb8ba9e8b8c8865cb5b1463924d38490d73d3ccfd80896b3f"},
+ {file = "pydantic-1.10.12-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:549a8e3d81df0a85226963611950b12d2d334f214436a19537b2efed61b7639a"},
+ {file = "pydantic-1.10.12-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:598da88dfa127b666852bef6d0d796573a8cf5009ffd62104094a4fe39599565"},
+ {file = "pydantic-1.10.12-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ba5c4a8552bff16c61882db58544116d021d0b31ee7c66958d14cf386a5b5350"},
+ {file = "pydantic-1.10.12-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c79e6a11a07da7374f46970410b41d5e266f7f38f6a17a9c4823db80dadf4303"},
+ {file = "pydantic-1.10.12-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:ab26038b8375581dc832a63c948f261ae0aa21f1d34c1293469f135fa92972a5"},
+ {file = "pydantic-1.10.12-cp37-cp37m-win_amd64.whl", hash = "sha256:e0a16d274b588767602b7646fa05af2782576a6cf1022f4ba74cbb4db66f6ca8"},
+ {file = "pydantic-1.10.12-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6a9dfa722316f4acf4460afdf5d41d5246a80e249c7ff475c43a3a1e9d75cf62"},
+ {file = "pydantic-1.10.12-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a73f489aebd0c2121ed974054cb2759af8a9f747de120acd2c3394cf84176ccb"},
+ {file = "pydantic-1.10.12-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b30bcb8cbfccfcf02acb8f1a261143fab622831d9c0989707e0e659f77a18e0"},
+ {file = "pydantic-1.10.12-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2fcfb5296d7877af406ba1547dfde9943b1256d8928732267e2653c26938cd9c"},
+ {file = "pydantic-1.10.12-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:2f9a6fab5f82ada41d56b0602606a5506aab165ca54e52bc4545028382ef1c5d"},
+ {file = "pydantic-1.10.12-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:dea7adcc33d5d105896401a1f37d56b47d443a2b2605ff8a969a0ed5543f7e33"},
+ {file = "pydantic-1.10.12-cp38-cp38-win_amd64.whl", hash = "sha256:1eb2085c13bce1612da8537b2d90f549c8cbb05c67e8f22854e201bde5d98a47"},
+ {file = "pydantic-1.10.12-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:ef6c96b2baa2100ec91a4b428f80d8f28a3c9e53568219b6c298c1125572ebc6"},
+ {file = "pydantic-1.10.12-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:6c076be61cd0177a8433c0adcb03475baf4ee91edf5a4e550161ad57fc90f523"},
+ {file = "pydantic-1.10.12-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2d5a58feb9a39f481eda4d5ca220aa8b9d4f21a41274760b9bc66bfd72595b86"},
+ {file = "pydantic-1.10.12-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e5f805d2d5d0a41633651a73fa4ecdd0b3d7a49de4ec3fadf062fe16501ddbf1"},
+ {file = "pydantic-1.10.12-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:1289c180abd4bd4555bb927c42ee42abc3aee02b0fb2d1223fb7c6e5bef87dbe"},
+ {file = "pydantic-1.10.12-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5d1197e462e0364906cbc19681605cb7c036f2475c899b6f296104ad42b9f5fb"},
+ {file = "pydantic-1.10.12-cp39-cp39-win_amd64.whl", hash = "sha256:fdbdd1d630195689f325c9ef1a12900524dceb503b00a987663ff4f58669b93d"},
+ {file = "pydantic-1.10.12-py3-none-any.whl", hash = "sha256:b749a43aa51e32839c9d71dc67eb1e4221bb04af1033a32e3923d46f9effa942"},
+ {file = "pydantic-1.10.12.tar.gz", hash = "sha256:0fe8a415cea8f340e7a9af9c54fc71a649b43e8ca3cc732986116b3cb135d303"},
]
[package.dependencies]
@@ -5067,14 +4996,13 @@ email = ["email-validator (>=1.0.3)"]
[[package]]
name = "pygments"
-version = "2.15.1"
+version = "2.16.1"
description = "Pygments is a syntax highlighting package written in Python."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "Pygments-2.15.1-py3-none-any.whl", hash = "sha256:db2db3deb4b4179f399a09054b023b6a586b76499d36965813c71aa8ed7b5fd1"},
- {file = "Pygments-2.15.1.tar.gz", hash = "sha256:8ace4d3c1dd481894b2005f560ead0f9f19ee64fe983366be1a21e171d12775c"},
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
]
[package.extras]
@@ -5084,7 +5012,6 @@ plugins = ["importlib-metadata"]
name = "pymongo"
version = "4.4.1"
description = "Python driver for MongoDB "
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -5179,7 +5106,6 @@ zstd = ["zstandard"]
name = "pypandoc"
version = "1.11"
description = "Thin wrapper for pandoc."
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -5189,14 +5115,13 @@ files = [
[[package]]
name = "pyparsing"
-version = "3.1.0"
+version = "3.1.1"
description = "pyparsing module - Classes and methods to define and execute parsing grammars"
-category = "main"
optional = false
python-versions = ">=3.6.8"
files = [
- {file = "pyparsing-3.1.0-py3-none-any.whl", hash = "sha256:d554a96d1a7d3ddaf7183104485bc19fd80543ad6ac5bdb6426719d766fb06c1"},
- {file = "pyparsing-3.1.0.tar.gz", hash = "sha256:edb662d6fe322d6e990b1594b5feaeadf806803359e3d4d42f11e295e588f0ea"},
+ {file = "pyparsing-3.1.1-py3-none-any.whl", hash = "sha256:32c7c0b711493c72ff18a981d24f28aaf9c1fb7ed5e9667c9e84e3db623bdbfb"},
+ {file = "pyparsing-3.1.1.tar.gz", hash = "sha256:ede28a1a32462f5a9705e07aea48001a08f7cf81a021585011deba701581a0db"},
]
[package.extras]
@@ -5204,31 +5129,29 @@ diagrams = ["jinja2", "railroad-diagrams"]
[[package]]
name = "pypdf"
-version = "3.13.0"
+version = "3.15.0"
description = "A pure-python PDF library capable of splitting, merging, cropping, and transforming PDF files"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
- {file = "pypdf-3.13.0-py3-none-any.whl", hash = "sha256:b77ded83c019fc9554837182de8397d34fbc9eb634b1e6a1b87c1484b52d4f3f"},
- {file = "pypdf-3.13.0.tar.gz", hash = "sha256:417e2ee36178e0540dba7f25121359de81e99e899f07ef4bdc4da75a3e17cabc"},
+ {file = "pypdf-3.15.0-py3-none-any.whl", hash = "sha256:2e29ddb62561ec91157c784783714703ddd3ce08f070ecbc57404fb86cd9fc97"},
+ {file = "pypdf-3.15.0.tar.gz", hash = "sha256:8a6264e1c47c63dc2484e29bdfa76b121435896a84e94b7c5ae82c6ae96354bb"},
]
[package.dependencies]
typing_extensions = {version = ">=3.10.0.0", markers = "python_version < \"3.10\""}
[package.extras]
-crypto = ["PyCryptodome"]
+crypto = ["PyCryptodome", "cryptography"]
dev = ["black", "flit", "pip-tools", "pre-commit (<2.18.0)", "pytest-cov", "pytest-socket", "wheel"]
docs = ["myst_parser", "sphinx", "sphinx_rtd_theme"]
-full = ["Pillow", "PyCryptodome"]
-image = ["Pillow"]
+full = ["Pillow (>=8.0.0)", "PyCryptodome", "cryptography"]
+image = ["Pillow (>=8.0.0)"]
[[package]]
name = "pyreadline3"
version = "3.4.1"
description = "A python implementation of GNU readline."
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -5240,7 +5163,6 @@ files = [
name = "pysrt"
version = "1.1.2"
description = "SubRip (.srt) subtitle parser and writer"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -5254,7 +5176,6 @@ chardet = "*"
name = "pytest"
version = "7.4.0"
description = "pytest: simple powerful testing with Python"
-category = "dev"
optional = false
python-versions = ">=3.7"
files = [
@@ -5277,7 +5198,6 @@ testing = ["argcomplete", "attrs (>=19.2.0)", "hypothesis (>=3.56)", "mock", "no
name = "pytest-cov"
version = "4.1.0"
description = "Pytest plugin for measuring coverage."
-category = "dev"
optional = false
python-versions = ">=3.7"
files = [
@@ -5296,7 +5216,6 @@ testing = ["fields", "hunter", "process-tests", "pytest-xdist", "six", "virtuale
name = "python-dateutil"
version = "2.8.2"
description = "Extensions to the standard Python datetime module"
-category = "main"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
files = [
@@ -5311,7 +5230,6 @@ six = ">=1.5"
name = "python-docx"
version = "0.8.11"
description = "Create and update Microsoft Word .docx files."
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -5325,7 +5243,6 @@ lxml = ">=2.3.2"
name = "python-dotenv"
version = "1.0.0"
description = "Read key-value pairs from a .env file and set them as environment variables"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -5340,7 +5257,6 @@ cli = ["click (>=5.0)"]
name = "python-gitlab"
version = "3.15.0"
description = "Interact with GitLab API"
-category = "main"
optional = false
python-versions = ">=3.7.0"
files = [
@@ -5360,7 +5276,6 @@ yaml = ["PyYaml (>=5.2)"]
name = "python-jose"
version = "3.3.0"
description = "JOSE implementation in Python"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -5382,7 +5297,6 @@ pycryptodome = ["pyasn1", "pycryptodome (>=3.3.1,<4.0.0)"]
name = "python-magic"
version = "0.4.27"
description = "File type identification using libmagic"
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
files = [
@@ -5394,7 +5308,6 @@ files = [
name = "python-multipart"
version = "0.0.6"
description = "A streaming multipart parser for Python"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -5409,7 +5322,6 @@ dev = ["atomicwrites (==1.2.1)", "attrs (==19.2.0)", "coverage (==6.5.0)", "hatc
name = "python-pptx"
version = "0.6.21"
description = "Generate and manipulate Open XML PowerPoint (.pptx) files"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -5425,7 +5337,6 @@ XlsxWriter = ">=0.5.7"
name = "python-semantic-release"
version = "7.33.2"
description = "Automatic Semantic Versioning for Python projects"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -5457,7 +5368,6 @@ test = ["coverage (>=5,<6)", "mock (==1.3.0)", "pytest (>=7,<8)", "pytest-mock (
name = "pytz"
version = "2023.3"
description = "World timezone definitions, modern and historical"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -5469,7 +5379,6 @@ files = [
name = "pywin32"
version = "306"
description = "Python for Window Extensions"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -5493,7 +5402,6 @@ files = [
name = "pywin32-ctypes"
version = "0.2.2"
description = "A (partial) reimplementation of pywin32 using ctypes/cffi"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -5505,7 +5413,6 @@ files = [
name = "pyyaml"
version = "6.0.1"
description = "YAML parser and emitter for Python"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -5553,89 +5460,104 @@ files = [
[[package]]
name = "pyzmq"
-version = "25.1.0"
+version = "25.1.1"
description = "Python bindings for 0MQ"
-category = "dev"
optional = false
python-versions = ">=3.6"
files = [
- {file = "pyzmq-25.1.0-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:1a6169e69034eaa06823da6a93a7739ff38716142b3596c180363dee729d713d"},
- {file = "pyzmq-25.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:19d0383b1f18411d137d891cab567de9afa609b214de68b86e20173dc624c101"},
- {file = "pyzmq-25.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f1e931d9a92f628858a50f5bdffdfcf839aebe388b82f9d2ccd5d22a38a789dc"},
- {file = "pyzmq-25.1.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:97d984b1b2f574bc1bb58296d3c0b64b10e95e7026f8716ed6c0b86d4679843f"},
- {file = "pyzmq-25.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:154bddda2a351161474b36dba03bf1463377ec226a13458725183e508840df89"},
- {file = "pyzmq-25.1.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:cb6d161ae94fb35bb518b74bb06b7293299c15ba3bc099dccd6a5b7ae589aee3"},
- {file = "pyzmq-25.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:90146ab578931e0e2826ee39d0c948d0ea72734378f1898939d18bc9c823fcf9"},
- {file = "pyzmq-25.1.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:831ba20b660b39e39e5ac8603e8193f8fce1ee03a42c84ade89c36a251449d80"},
- {file = "pyzmq-25.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:3a522510e3434e12aff80187144c6df556bb06fe6b9d01b2ecfbd2b5bfa5c60c"},
- {file = "pyzmq-25.1.0-cp310-cp310-win32.whl", hash = "sha256:be24a5867b8e3b9dd5c241de359a9a5217698ff616ac2daa47713ba2ebe30ad1"},
- {file = "pyzmq-25.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:5693dcc4f163481cf79e98cf2d7995c60e43809e325b77a7748d8024b1b7bcba"},
- {file = "pyzmq-25.1.0-cp311-cp311-macosx_10_15_universal2.whl", hash = "sha256:13bbe36da3f8aaf2b7ec12696253c0bf6ffe05f4507985a8844a1081db6ec22d"},
- {file = "pyzmq-25.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:69511d604368f3dc58d4be1b0bad99b61ee92b44afe1cd9b7bd8c5e34ea8248a"},
- {file = "pyzmq-25.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4a983c8694667fd76d793ada77fd36c8317e76aa66eec75be2653cef2ea72883"},
- {file = "pyzmq-25.1.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:332616f95eb400492103ab9d542b69d5f0ff628b23129a4bc0a2fd48da6e4e0b"},
- {file = "pyzmq-25.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58416db767787aedbfd57116714aad6c9ce57215ffa1c3758a52403f7c68cff5"},
- {file = "pyzmq-25.1.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:cad9545f5801a125f162d09ec9b724b7ad9b6440151b89645241d0120e119dcc"},
- {file = "pyzmq-25.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d6128d431b8dfa888bf51c22a04d48bcb3d64431caf02b3cb943269f17fd2994"},
- {file = "pyzmq-25.1.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:2b15247c49d8cbea695b321ae5478d47cffd496a2ec5ef47131a9e79ddd7e46c"},
- {file = "pyzmq-25.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:442d3efc77ca4d35bee3547a8e08e8d4bb88dadb54a8377014938ba98d2e074a"},
- {file = "pyzmq-25.1.0-cp311-cp311-win32.whl", hash = "sha256:65346f507a815a731092421d0d7d60ed551a80d9b75e8b684307d435a5597425"},
- {file = "pyzmq-25.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:8b45d722046fea5a5694cba5d86f21f78f0052b40a4bbbbf60128ac55bfcc7b6"},
- {file = "pyzmq-25.1.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:f45808eda8b1d71308c5416ef3abe958f033fdbb356984fabbfc7887bed76b3f"},
- {file = "pyzmq-25.1.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b697774ea8273e3c0460cf0bba16cd85ca6c46dfe8b303211816d68c492e132"},
- {file = "pyzmq-25.1.0-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b324fa769577fc2c8f5efcd429cef5acbc17d63fe15ed16d6dcbac2c5eb00849"},
- {file = "pyzmq-25.1.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:5873d6a60b778848ce23b6c0ac26c39e48969823882f607516b91fb323ce80e5"},
- {file = "pyzmq-25.1.0-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:f0d9e7ba6a815a12c8575ba7887da4b72483e4cfc57179af10c9b937f3f9308f"},
- {file = "pyzmq-25.1.0-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:414b8beec76521358b49170db7b9967d6974bdfc3297f47f7d23edec37329b00"},
- {file = "pyzmq-25.1.0-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:01f06f33e12497dca86353c354461f75275a5ad9eaea181ac0dc1662da8074fa"},
- {file = "pyzmq-25.1.0-cp36-cp36m-win32.whl", hash = "sha256:b5a07c4f29bf7cb0164664ef87e4aa25435dcc1f818d29842118b0ac1eb8e2b5"},
- {file = "pyzmq-25.1.0-cp36-cp36m-win_amd64.whl", hash = "sha256:968b0c737797c1809ec602e082cb63e9824ff2329275336bb88bd71591e94a90"},
- {file = "pyzmq-25.1.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:47b915ba666c51391836d7ed9a745926b22c434efa76c119f77bcffa64d2c50c"},
- {file = "pyzmq-25.1.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5af31493663cf76dd36b00dafbc839e83bbca8a0662931e11816d75f36155897"},
- {file = "pyzmq-25.1.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5489738a692bc7ee9a0a7765979c8a572520d616d12d949eaffc6e061b82b4d1"},
- {file = "pyzmq-25.1.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:1fc56a0221bdf67cfa94ef2d6ce5513a3d209c3dfd21fed4d4e87eca1822e3a3"},
- {file = "pyzmq-25.1.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:75217e83faea9edbc29516fc90c817bc40c6b21a5771ecb53e868e45594826b0"},
- {file = "pyzmq-25.1.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:3830be8826639d801de9053cf86350ed6742c4321ba4236e4b5568528d7bfed7"},
- {file = "pyzmq-25.1.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:3575699d7fd7c9b2108bc1c6128641a9a825a58577775ada26c02eb29e09c517"},
- {file = "pyzmq-25.1.0-cp37-cp37m-win32.whl", hash = "sha256:95bd3a998d8c68b76679f6b18f520904af5204f089beebb7b0301d97704634dd"},
- {file = "pyzmq-25.1.0-cp37-cp37m-win_amd64.whl", hash = "sha256:dbc466744a2db4b7ca05589f21ae1a35066afada2f803f92369f5877c100ef62"},
- {file = "pyzmq-25.1.0-cp38-cp38-macosx_10_15_universal2.whl", hash = "sha256:3bed53f7218490c68f0e82a29c92335daa9606216e51c64f37b48eb78f1281f4"},
- {file = "pyzmq-25.1.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:eb52e826d16c09ef87132c6e360e1879c984f19a4f62d8a935345deac43f3c12"},
- {file = "pyzmq-25.1.0-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:ddbef8b53cd16467fdbfa92a712eae46dd066aa19780681a2ce266e88fbc7165"},
- {file = "pyzmq-25.1.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:9301cf1d7fc1ddf668d0abbe3e227fc9ab15bc036a31c247276012abb921b5ff"},
- {file = "pyzmq-25.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7e23a8c3b6c06de40bdb9e06288180d630b562db8ac199e8cc535af81f90e64b"},
- {file = "pyzmq-25.1.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:4a82faae00d1eed4809c2f18b37f15ce39a10a1c58fe48b60ad02875d6e13d80"},
- {file = "pyzmq-25.1.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:c8398a1b1951aaa330269c35335ae69744be166e67e0ebd9869bdc09426f3871"},
- {file = "pyzmq-25.1.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d40682ac60b2a613d36d8d3a0cd14fbdf8e7e0618fbb40aa9fa7b796c9081584"},
- {file = "pyzmq-25.1.0-cp38-cp38-win32.whl", hash = "sha256:33d5c8391a34d56224bccf74f458d82fc6e24b3213fc68165c98b708c7a69325"},
- {file = "pyzmq-25.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:c66b7ff2527e18554030319b1376d81560ca0742c6e0b17ff1ee96624a5f1afd"},
- {file = "pyzmq-25.1.0-cp39-cp39-macosx_10_15_universal2.whl", hash = "sha256:af56229ea6527a849ac9fb154a059d7e32e77a8cba27e3e62a1e38d8808cb1a5"},
- {file = "pyzmq-25.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:bdca18b94c404af6ae5533cd1bc310c4931f7ac97c148bbfd2cd4bdd62b96253"},
- {file = "pyzmq-25.1.0-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:0b6b42f7055bbc562f63f3df3b63e3dd1ebe9727ff0f124c3aa7bcea7b3a00f9"},
- {file = "pyzmq-25.1.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:4c2fc7aad520a97d64ffc98190fce6b64152bde57a10c704b337082679e74f67"},
- {file = "pyzmq-25.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:be86a26415a8b6af02cd8d782e3a9ae3872140a057f1cadf0133de685185c02b"},
- {file = "pyzmq-25.1.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:851fb2fe14036cfc1960d806628b80276af5424db09fe5c91c726890c8e6d943"},
- {file = "pyzmq-25.1.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:2a21fec5c3cea45421a19ccbe6250c82f97af4175bc09de4d6dd78fb0cb4c200"},
- {file = "pyzmq-25.1.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:bad172aba822444b32eae54c2d5ab18cd7dee9814fd5c7ed026603b8cae2d05f"},
- {file = "pyzmq-25.1.0-cp39-cp39-win32.whl", hash = "sha256:4d67609b37204acad3d566bb7391e0ecc25ef8bae22ff72ebe2ad7ffb7847158"},
- {file = "pyzmq-25.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:71c7b5896e40720d30cd77a81e62b433b981005bbff0cb2f739e0f8d059b5d99"},
- {file = "pyzmq-25.1.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:4cb27ef9d3bdc0c195b2dc54fcb8720e18b741624686a81942e14c8b67cc61a6"},
- {file = "pyzmq-25.1.0-pp37-pypy37_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:0c4fc2741e0513b5d5a12fe200d6785bbcc621f6f2278893a9ca7bed7f2efb7d"},
- {file = "pyzmq-25.1.0-pp37-pypy37_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:fc34fdd458ff77a2a00e3c86f899911f6f269d393ca5675842a6e92eea565bae"},
- {file = "pyzmq-25.1.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8751f9c1442624da391bbd92bd4b072def6d7702a9390e4479f45c182392ff78"},
- {file = "pyzmq-25.1.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:6581e886aec3135964a302a0f5eb68f964869b9efd1dbafdebceaaf2934f8a68"},
- {file = "pyzmq-25.1.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:5482f08d2c3c42b920e8771ae8932fbaa0a67dff925fc476996ddd8155a170f3"},
- {file = "pyzmq-25.1.0-pp38-pypy38_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:5e7fbcafa3ea16d1de1f213c226005fea21ee16ed56134b75b2dede5a2129e62"},
- {file = "pyzmq-25.1.0-pp38-pypy38_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:adecf6d02b1beab8d7c04bc36f22bb0e4c65a35eb0b4750b91693631d4081c70"},
- {file = "pyzmq-25.1.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f6d39e42a0aa888122d1beb8ec0d4ddfb6c6b45aecb5ba4013c27e2f28657765"},
- {file = "pyzmq-25.1.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:7018289b402ebf2b2c06992813523de61d4ce17bd514c4339d8f27a6f6809492"},
- {file = "pyzmq-25.1.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:9e68ae9864d260b18f311b68d29134d8776d82e7f5d75ce898b40a88df9db30f"},
- {file = "pyzmq-25.1.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e21cc00e4debe8f54c3ed7b9fcca540f46eee12762a9fa56feb8512fd9057161"},
- {file = "pyzmq-25.1.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2f666ae327a6899ff560d741681fdcdf4506f990595201ed39b44278c471ad98"},
- {file = "pyzmq-25.1.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2f5efcc29056dfe95e9c9db0dfbb12b62db9c4ad302f812931b6d21dd04a9119"},
- {file = "pyzmq-25.1.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:48e5e59e77c1a83162ab3c163fc01cd2eebc5b34560341a67421b09be0891287"},
- {file = "pyzmq-25.1.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:108c96ebbd573d929740d66e4c3d1bdf31d5cde003b8dc7811a3c8c5b0fc173b"},
- {file = "pyzmq-25.1.0.tar.gz", hash = "sha256:80c41023465d36280e801564a69cbfce8ae85ff79b080e1913f6e90481fb8957"},
+ {file = "pyzmq-25.1.1-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:381469297409c5adf9a0e884c5eb5186ed33137badcbbb0560b86e910a2f1e76"},
+ {file = "pyzmq-25.1.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:955215ed0604dac5b01907424dfa28b40f2b2292d6493445dd34d0dfa72586a8"},
+ {file = "pyzmq-25.1.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:985bbb1316192b98f32e25e7b9958088431d853ac63aca1d2c236f40afb17c83"},
+ {file = "pyzmq-25.1.1-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:afea96f64efa98df4da6958bae37f1cbea7932c35878b185e5982821bc883369"},
+ {file = "pyzmq-25.1.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:76705c9325d72a81155bb6ab48d4312e0032bf045fb0754889133200f7a0d849"},
+ {file = "pyzmq-25.1.1-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:77a41c26205d2353a4c94d02be51d6cbdf63c06fbc1295ea57dad7e2d3381b71"},
+ {file = "pyzmq-25.1.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:12720a53e61c3b99d87262294e2b375c915fea93c31fc2336898c26d7aed34cd"},
+ {file = "pyzmq-25.1.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:57459b68e5cd85b0be8184382cefd91959cafe79ae019e6b1ae6e2ba8a12cda7"},
+ {file = "pyzmq-25.1.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:292fe3fc5ad4a75bc8df0dfaee7d0babe8b1f4ceb596437213821f761b4589f9"},
+ {file = "pyzmq-25.1.1-cp310-cp310-win32.whl", hash = "sha256:35b5ab8c28978fbbb86ea54958cd89f5176ce747c1fb3d87356cf698048a7790"},
+ {file = "pyzmq-25.1.1-cp310-cp310-win_amd64.whl", hash = "sha256:11baebdd5fc5b475d484195e49bae2dc64b94a5208f7c89954e9e354fc609d8f"},
+ {file = "pyzmq-25.1.1-cp311-cp311-macosx_10_15_universal2.whl", hash = "sha256:d20a0ddb3e989e8807d83225a27e5c2eb2260eaa851532086e9e0fa0d5287d83"},
+ {file = "pyzmq-25.1.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:e1c1be77bc5fb77d923850f82e55a928f8638f64a61f00ff18a67c7404faf008"},
+ {file = "pyzmq-25.1.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d89528b4943d27029a2818f847c10c2cecc79fa9590f3cb1860459a5be7933eb"},
+ {file = "pyzmq-25.1.1-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:90f26dc6d5f241ba358bef79be9ce06de58d477ca8485e3291675436d3827cf8"},
+ {file = "pyzmq-25.1.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c2b92812bd214018e50b6380ea3ac0c8bb01ac07fcc14c5f86a5bb25e74026e9"},
+ {file = "pyzmq-25.1.1-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:2f957ce63d13c28730f7fd6b72333814221c84ca2421298f66e5143f81c9f91f"},
+ {file = "pyzmq-25.1.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:047a640f5c9c6ade7b1cc6680a0e28c9dd5a0825135acbd3569cc96ea00b2505"},
+ {file = "pyzmq-25.1.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:7f7e58effd14b641c5e4dec8c7dab02fb67a13df90329e61c869b9cc607ef752"},
+ {file = "pyzmq-25.1.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:c2910967e6ab16bf6fbeb1f771c89a7050947221ae12a5b0b60f3bca2ee19bca"},
+ {file = "pyzmq-25.1.1-cp311-cp311-win32.whl", hash = "sha256:76c1c8efb3ca3a1818b837aea423ff8a07bbf7aafe9f2f6582b61a0458b1a329"},
+ {file = "pyzmq-25.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:44e58a0554b21fc662f2712814a746635ed668d0fbc98b7cb9d74cb798d202e6"},
+ {file = "pyzmq-25.1.1-cp312-cp312-macosx_10_15_universal2.whl", hash = "sha256:e1ffa1c924e8c72778b9ccd386a7067cddf626884fd8277f503c48bb5f51c762"},
+ {file = "pyzmq-25.1.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:1af379b33ef33757224da93e9da62e6471cf4a66d10078cf32bae8127d3d0d4a"},
+ {file = "pyzmq-25.1.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cff084c6933680d1f8b2f3b4ff5bbb88538a4aac00d199ac13f49d0698727ecb"},
+ {file = "pyzmq-25.1.1-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e2400a94f7dd9cb20cd012951a0cbf8249e3d554c63a9c0cdfd5cbb6c01d2dec"},
+ {file = "pyzmq-25.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2d81f1ddae3858b8299d1da72dd7d19dd36aab654c19671aa8a7e7fb02f6638a"},
+ {file = "pyzmq-25.1.1-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:255ca2b219f9e5a3a9ef3081512e1358bd4760ce77828e1028b818ff5610b87b"},
+ {file = "pyzmq-25.1.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:a882ac0a351288dd18ecae3326b8a49d10c61a68b01419f3a0b9a306190baf69"},
+ {file = "pyzmq-25.1.1-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:724c292bb26365659fc434e9567b3f1adbdb5e8d640c936ed901f49e03e5d32e"},
+ {file = "pyzmq-25.1.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ca1ed0bb2d850aa8471387882247c68f1e62a4af0ce9c8a1dbe0d2bf69e41fb"},
+ {file = "pyzmq-25.1.1-cp312-cp312-win32.whl", hash = "sha256:b3451108ab861040754fa5208bca4a5496c65875710f76789a9ad27c801a0075"},
+ {file = "pyzmq-25.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:eadbefd5e92ef8a345f0525b5cfd01cf4e4cc651a2cffb8f23c0dd184975d787"},
+ {file = "pyzmq-25.1.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:db0b2af416ba735c6304c47f75d348f498b92952f5e3e8bff449336d2728795d"},
+ {file = "pyzmq-25.1.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c7c133e93b405eb0d36fa430c94185bdd13c36204a8635470cccc200723c13bb"},
+ {file = "pyzmq-25.1.1-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:273bc3959bcbff3f48606b28229b4721716598d76b5aaea2b4a9d0ab454ec062"},
+ {file = "pyzmq-25.1.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:cbc8df5c6a88ba5ae385d8930da02201165408dde8d8322072e3e5ddd4f68e22"},
+ {file = "pyzmq-25.1.1-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:18d43df3f2302d836f2a56f17e5663e398416e9dd74b205b179065e61f1a6edf"},
+ {file = "pyzmq-25.1.1-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:73461eed88a88c866656e08f89299720a38cb4e9d34ae6bf5df6f71102570f2e"},
+ {file = "pyzmq-25.1.1-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:34c850ce7976d19ebe7b9d4b9bb8c9dfc7aac336c0958e2651b88cbd46682123"},
+ {file = "pyzmq-25.1.1-cp36-cp36m-win32.whl", hash = "sha256:d2045d6d9439a0078f2a34b57c7b18c4a6aef0bee37f22e4ec9f32456c852c71"},
+ {file = "pyzmq-25.1.1-cp36-cp36m-win_amd64.whl", hash = "sha256:458dea649f2f02a0b244ae6aef8dc29325a2810aa26b07af8374dc2a9faf57e3"},
+ {file = "pyzmq-25.1.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:7cff25c5b315e63b07a36f0c2bab32c58eafbe57d0dce61b614ef4c76058c115"},
+ {file = "pyzmq-25.1.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b1579413ae492b05de5a6174574f8c44c2b9b122a42015c5292afa4be2507f28"},
+ {file = "pyzmq-25.1.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:3d0a409d3b28607cc427aa5c30a6f1e4452cc44e311f843e05edb28ab5e36da0"},
+ {file = "pyzmq-25.1.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:21eb4e609a154a57c520e3d5bfa0d97e49b6872ea057b7c85257b11e78068222"},
+ {file = "pyzmq-25.1.1-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:034239843541ef7a1aee0c7b2cb7f6aafffb005ede965ae9cbd49d5ff4ff73cf"},
+ {file = "pyzmq-25.1.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:f8115e303280ba09f3898194791a153862cbf9eef722ad8f7f741987ee2a97c7"},
+ {file = "pyzmq-25.1.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:1a5d26fe8f32f137e784f768143728438877d69a586ddeaad898558dc971a5ae"},
+ {file = "pyzmq-25.1.1-cp37-cp37m-win32.whl", hash = "sha256:f32260e556a983bc5c7ed588d04c942c9a8f9c2e99213fec11a031e316874c7e"},
+ {file = "pyzmq-25.1.1-cp37-cp37m-win_amd64.whl", hash = "sha256:abf34e43c531bbb510ae7e8f5b2b1f2a8ab93219510e2b287a944432fad135f3"},
+ {file = "pyzmq-25.1.1-cp38-cp38-macosx_10_15_universal2.whl", hash = "sha256:87e34f31ca8f168c56d6fbf99692cc8d3b445abb5bfd08c229ae992d7547a92a"},
+ {file = "pyzmq-25.1.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:c9c6c9b2c2f80747a98f34ef491c4d7b1a8d4853937bb1492774992a120f475d"},
+ {file = "pyzmq-25.1.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:5619f3f5a4db5dbb572b095ea3cb5cc035335159d9da950830c9c4db2fbb6995"},
+ {file = "pyzmq-25.1.1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:5a34d2395073ef862b4032343cf0c32a712f3ab49d7ec4f42c9661e0294d106f"},
+ {file = "pyzmq-25.1.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:25f0e6b78220aba09815cd1f3a32b9c7cb3e02cb846d1cfc526b6595f6046618"},
+ {file = "pyzmq-25.1.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:3669cf8ee3520c2f13b2e0351c41fea919852b220988d2049249db10046a7afb"},
+ {file = "pyzmq-25.1.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:2d163a18819277e49911f7461567bda923461c50b19d169a062536fffe7cd9d2"},
+ {file = "pyzmq-25.1.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:df27ffddff4190667d40de7beba4a950b5ce78fe28a7dcc41d6f8a700a80a3c0"},
+ {file = "pyzmq-25.1.1-cp38-cp38-win32.whl", hash = "sha256:a382372898a07479bd34bda781008e4a954ed8750f17891e794521c3e21c2e1c"},
+ {file = "pyzmq-25.1.1-cp38-cp38-win_amd64.whl", hash = "sha256:52533489f28d62eb1258a965f2aba28a82aa747202c8fa5a1c7a43b5db0e85c1"},
+ {file = "pyzmq-25.1.1-cp39-cp39-macosx_10_15_universal2.whl", hash = "sha256:03b3f49b57264909aacd0741892f2aecf2f51fb053e7d8ac6767f6c700832f45"},
+ {file = "pyzmq-25.1.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:330f9e188d0d89080cde66dc7470f57d1926ff2fb5576227f14d5be7ab30b9fa"},
+ {file = "pyzmq-25.1.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:2ca57a5be0389f2a65e6d3bb2962a971688cbdd30b4c0bd188c99e39c234f414"},
+ {file = "pyzmq-25.1.1-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:d457aed310f2670f59cc5b57dcfced452aeeed77f9da2b9763616bd57e4dbaae"},
+ {file = "pyzmq-25.1.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c56d748ea50215abef7030c72b60dd723ed5b5c7e65e7bc2504e77843631c1a6"},
+ {file = "pyzmq-25.1.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:8f03d3f0d01cb5a018debeb412441996a517b11c5c17ab2001aa0597c6d6882c"},
+ {file = "pyzmq-25.1.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:820c4a08195a681252f46926de10e29b6bbf3e17b30037bd4250d72dd3ddaab8"},
+ {file = "pyzmq-25.1.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:17ef5f01d25b67ca8f98120d5fa1d21efe9611604e8eb03a5147360f517dd1e2"},
+ {file = "pyzmq-25.1.1-cp39-cp39-win32.whl", hash = "sha256:04ccbed567171579ec2cebb9c8a3e30801723c575601f9a990ab25bcac6b51e2"},
+ {file = "pyzmq-25.1.1-cp39-cp39-win_amd64.whl", hash = "sha256:e61f091c3ba0c3578411ef505992d356a812fb200643eab27f4f70eed34a29ef"},
+ {file = "pyzmq-25.1.1-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:ade6d25bb29c4555d718ac6d1443a7386595528c33d6b133b258f65f963bb0f6"},
+ {file = "pyzmq-25.1.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e0c95ddd4f6e9fca4e9e3afaa4f9df8552f0ba5d1004e89ef0a68e1f1f9807c7"},
+ {file = "pyzmq-25.1.1-pp310-pypy310_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:48e466162a24daf86f6b5ca72444d2bf39a5e58da5f96370078be67c67adc978"},
+ {file = "pyzmq-25.1.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:abc719161780932c4e11aaebb203be3d6acc6b38d2f26c0f523b5b59d2fc1996"},
+ {file = "pyzmq-25.1.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:1ccf825981640b8c34ae54231b7ed00271822ea1c6d8ba1090ebd4943759abf5"},
+ {file = "pyzmq-25.1.1-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:c2f20ce161ebdb0091a10c9ca0372e023ce24980d0e1f810f519da6f79c60800"},
+ {file = "pyzmq-25.1.1-pp37-pypy37_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:deee9ca4727f53464daf089536e68b13e6104e84a37820a88b0a057b97bba2d2"},
+ {file = "pyzmq-25.1.1-pp37-pypy37_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:aa8d6cdc8b8aa19ceb319aaa2b660cdaccc533ec477eeb1309e2a291eaacc43a"},
+ {file = "pyzmq-25.1.1-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:019e59ef5c5256a2c7378f2fb8560fc2a9ff1d315755204295b2eab96b254d0a"},
+ {file = "pyzmq-25.1.1-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:b9af3757495c1ee3b5c4e945c1df7be95562277c6e5bccc20a39aec50f826cd0"},
+ {file = "pyzmq-25.1.1-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:548d6482dc8aadbe7e79d1b5806585c8120bafa1ef841167bc9090522b610fa6"},
+ {file = "pyzmq-25.1.1-pp38-pypy38_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:057e824b2aae50accc0f9a0570998adc021b372478a921506fddd6c02e60308e"},
+ {file = "pyzmq-25.1.1-pp38-pypy38_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2243700cc5548cff20963f0ca92d3e5e436394375ab8a354bbea2b12911b20b0"},
+ {file = "pyzmq-25.1.1-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:79986f3b4af059777111409ee517da24a529bdbd46da578b33f25580adcff728"},
+ {file = "pyzmq-25.1.1-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:11d58723d44d6ed4dd677c5615b2ffb19d5c426636345567d6af82be4dff8a55"},
+ {file = "pyzmq-25.1.1-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:49d238cf4b69652257db66d0c623cd3e09b5d2e9576b56bc067a396133a00d4a"},
+ {file = "pyzmq-25.1.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fedbdc753827cf014c01dbbee9c3be17e5a208dcd1bf8641ce2cd29580d1f0d4"},
+ {file = "pyzmq-25.1.1-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bc16ac425cc927d0a57d242589f87ee093884ea4804c05a13834d07c20db203c"},
+ {file = "pyzmq-25.1.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:11c1d2aed9079c6b0c9550a7257a836b4a637feb334904610f06d70eb44c56d2"},
+ {file = "pyzmq-25.1.1-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e8a701123029cc240cea61dd2d16ad57cab4691804143ce80ecd9286b464d180"},
+ {file = "pyzmq-25.1.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:61706a6b6c24bdece85ff177fec393545a3191eeda35b07aaa1458a027ad1304"},
+ {file = "pyzmq-25.1.1.tar.gz", hash = "sha256:259c22485b71abacdfa8bf79720cd7bcf4b9d128b30ea554f01ae71fdbfdaa23"},
]
[package.dependencies]
@@ -5643,14 +5565,13 @@ cffi = {version = "*", markers = "implementation_name == \"pypy\""}
[[package]]
name = "qdrant-client"
-version = "1.3.1"
+version = "1.4.0"
description = "Client library for the Qdrant vector search engine"
-category = "main"
optional = false
python-versions = ">=3.7,<3.12"
files = [
- {file = "qdrant_client-1.3.1-py3-none-any.whl", hash = "sha256:9640855585d1f532094e342f07e0f2ef00652a60fc5d903c92ca3989a1e86318"},
- {file = "qdrant_client-1.3.1.tar.gz", hash = "sha256:a999358b10e611d71b4b04c6ded36a6cfc963e56b4c3f99d9c1a603ca524a82e"},
+ {file = "qdrant_client-1.4.0-py3-none-any.whl", hash = "sha256:2f9e563955b5163da98016f2ed38d9aea5058576c7c5844e9aa205d28155f56d"},
+ {file = "qdrant_client-1.4.0.tar.gz", hash = "sha256:2e54f5a80eb1e7e67f4603b76365af4817af15fb3d0c0f44de4fd93afbbe5537"},
]
[package.dependencies]
@@ -5659,15 +5580,13 @@ grpcio-tools = ">=1.41.0"
httpx = {version = ">=0.14.0", extras = ["http2"]}
numpy = {version = ">=1.21", markers = "python_version >= \"3.8\""}
portalocker = ">=2.7.0,<3.0.0"
-pydantic = ">=1.8,<2.0"
-typing-extensions = ">=4.0.0,<4.6.0"
+pydantic = ">=1.10.8"
urllib3 = ">=1.26.14,<2.0.0"
[[package]]
name = "readme-renderer"
version = "40.0"
description = "readme_renderer is a library for rendering \"readme\" descriptions for Warehouse"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -5687,7 +5606,6 @@ md = ["cmarkgfm (>=0.8.0)"]
name = "realtime"
version = "1.0.0"
description = ""
-category = "main"
optional = false
python-versions = ">=3.8,<4.0"
files = [
@@ -5702,107 +5620,105 @@ websockets = ">=10.3,<11.0"
[[package]]
name = "regex"
-version = "2023.6.3"
+version = "2023.8.8"
description = "Alternative regular expression module, to replace re."
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
- {file = "regex-2023.6.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:824bf3ac11001849aec3fa1d69abcb67aac3e150a933963fb12bda5151fe1bfd"},
- {file = "regex-2023.6.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:05ed27acdf4465c95826962528f9e8d41dbf9b1aa8531a387dee6ed215a3e9ef"},
- {file = "regex-2023.6.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b49c764f88a79160fa64f9a7b425620e87c9f46095ef9c9920542ab2495c8bc"},
- {file = "regex-2023.6.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8e3f1316c2293e5469f8f09dc2d76efb6c3982d3da91ba95061a7e69489a14ef"},
- {file = "regex-2023.6.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:43e1dd9d12df9004246bacb79a0e5886b3b6071b32e41f83b0acbf293f820ee8"},
- {file = "regex-2023.6.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4959e8bcbfda5146477d21c3a8ad81b185cd252f3d0d6e4724a5ef11c012fb06"},
- {file = "regex-2023.6.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:af4dd387354dc83a3bff67127a124c21116feb0d2ef536805c454721c5d7993d"},
- {file = "regex-2023.6.3-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2239d95d8e243658b8dbb36b12bd10c33ad6e6933a54d36ff053713f129aa536"},
- {file = "regex-2023.6.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:890e5a11c97cf0d0c550eb661b937a1e45431ffa79803b942a057c4fb12a2da2"},
- {file = "regex-2023.6.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:a8105e9af3b029f243ab11ad47c19b566482c150c754e4c717900a798806b222"},
- {file = "regex-2023.6.3-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:25be746a8ec7bc7b082783216de8e9473803706723b3f6bef34b3d0ed03d57e2"},
- {file = "regex-2023.6.3-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:3676f1dd082be28b1266c93f618ee07741b704ab7b68501a173ce7d8d0d0ca18"},
- {file = "regex-2023.6.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:10cb847aeb1728412c666ab2e2000ba6f174f25b2bdc7292e7dd71b16db07568"},
- {file = "regex-2023.6.3-cp310-cp310-win32.whl", hash = "sha256:dbbbfce33cd98f97f6bffb17801b0576e653f4fdb1d399b2ea89638bc8d08ae1"},
- {file = "regex-2023.6.3-cp310-cp310-win_amd64.whl", hash = "sha256:c5f8037000eb21e4823aa485149f2299eb589f8d1fe4b448036d230c3f4e68e0"},
- {file = "regex-2023.6.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c123f662be8ec5ab4ea72ea300359023a5d1df095b7ead76fedcd8babbedf969"},
- {file = "regex-2023.6.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9edcbad1f8a407e450fbac88d89e04e0b99a08473f666a3f3de0fd292badb6aa"},
- {file = "regex-2023.6.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dcba6dae7de533c876255317c11f3abe4907ba7d9aa15d13e3d9710d4315ec0e"},
- {file = "regex-2023.6.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:29cdd471ebf9e0f2fb3cac165efedc3c58db841d83a518b082077e612d3ee5df"},
- {file = "regex-2023.6.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:12b74fbbf6cbbf9dbce20eb9b5879469e97aeeaa874145517563cca4029db65c"},
- {file = "regex-2023.6.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0c29ca1bd61b16b67be247be87390ef1d1ef702800f91fbd1991f5c4421ebae8"},
- {file = "regex-2023.6.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d77f09bc4b55d4bf7cc5eba785d87001d6757b7c9eec237fe2af57aba1a071d9"},
- {file = "regex-2023.6.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ea353ecb6ab5f7e7d2f4372b1e779796ebd7b37352d290096978fea83c4dba0c"},
- {file = "regex-2023.6.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:10590510780b7541969287512d1b43f19f965c2ece6c9b1c00fc367b29d8dce7"},
- {file = "regex-2023.6.3-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:e2fbd6236aae3b7f9d514312cdb58e6494ee1c76a9948adde6eba33eb1c4264f"},
- {file = "regex-2023.6.3-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:6b2675068c8b56f6bfd5a2bda55b8accbb96c02fd563704732fd1c95e2083461"},
- {file = "regex-2023.6.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:74419d2b50ecb98360cfaa2974da8689cb3b45b9deff0dcf489c0d333bcc1477"},
- {file = "regex-2023.6.3-cp311-cp311-win32.whl", hash = "sha256:fb5ec16523dc573a4b277663a2b5a364e2099902d3944c9419a40ebd56a118f9"},
- {file = "regex-2023.6.3-cp311-cp311-win_amd64.whl", hash = "sha256:09e4a1a6acc39294a36b7338819b10baceb227f7f7dbbea0506d419b5a1dd8af"},
- {file = "regex-2023.6.3-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:0654bca0cdf28a5956c83839162692725159f4cda8d63e0911a2c0dc76166525"},
- {file = "regex-2023.6.3-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:463b6a3ceb5ca952e66550a4532cef94c9a0c80dc156c4cc343041951aec1697"},
- {file = "regex-2023.6.3-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:87b2a5bb5e78ee0ad1de71c664d6eb536dc3947a46a69182a90f4410f5e3f7dd"},
- {file = "regex-2023.6.3-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6343c6928282c1f6a9db41f5fd551662310e8774c0e5ebccb767002fcf663ca9"},
- {file = "regex-2023.6.3-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b6192d5af2ccd2a38877bfef086d35e6659566a335b1492786ff254c168b1693"},
- {file = "regex-2023.6.3-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:74390d18c75054947e4194019077e243c06fbb62e541d8817a0fa822ea310c14"},
- {file = "regex-2023.6.3-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:742e19a90d9bb2f4a6cf2862b8b06dea5e09b96c9f2df1779e53432d7275331f"},
- {file = "regex-2023.6.3-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:8abbc5d54ea0ee80e37fef009e3cec5dafd722ed3c829126253d3e22f3846f1e"},
- {file = "regex-2023.6.3-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:c2b867c17a7a7ae44c43ebbeb1b5ff406b3e8d5b3e14662683e5e66e6cc868d3"},
- {file = "regex-2023.6.3-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:d831c2f8ff278179705ca59f7e8524069c1a989e716a1874d6d1aab6119d91d1"},
- {file = "regex-2023.6.3-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:ee2d1a9a253b1729bb2de27d41f696ae893507c7db224436abe83ee25356f5c1"},
- {file = "regex-2023.6.3-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:61474f0b41fe1a80e8dfa70f70ea1e047387b7cd01c85ec88fa44f5d7561d787"},
- {file = "regex-2023.6.3-cp36-cp36m-win32.whl", hash = "sha256:0b71e63226e393b534105fcbdd8740410dc6b0854c2bfa39bbda6b0d40e59a54"},
- {file = "regex-2023.6.3-cp36-cp36m-win_amd64.whl", hash = "sha256:bbb02fd4462f37060122e5acacec78e49c0fbb303c30dd49c7f493cf21fc5b27"},
- {file = "regex-2023.6.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b862c2b9d5ae38a68b92e215b93f98d4c5e9454fa36aae4450f61dd33ff48487"},
- {file = "regex-2023.6.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:976d7a304b59ede34ca2921305b57356694f9e6879db323fd90a80f865d355a3"},
- {file = "regex-2023.6.3-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:83320a09188e0e6c39088355d423aa9d056ad57a0b6c6381b300ec1a04ec3d16"},
- {file = "regex-2023.6.3-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9427a399501818a7564f8c90eced1e9e20709ece36be701f394ada99890ea4b3"},
- {file = "regex-2023.6.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7178bbc1b2ec40eaca599d13c092079bf529679bf0371c602edaa555e10b41c3"},
- {file = "regex-2023.6.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:837328d14cde912af625d5f303ec29f7e28cdab588674897baafaf505341f2fc"},
- {file = "regex-2023.6.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2d44dc13229905ae96dd2ae2dd7cebf824ee92bc52e8cf03dcead37d926da019"},
- {file = "regex-2023.6.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:d54af539295392611e7efbe94e827311eb8b29668e2b3f4cadcfe6f46df9c777"},
- {file = "regex-2023.6.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:7117d10690c38a622e54c432dfbbd3cbd92f09401d622902c32f6d377e2300ee"},
- {file = "regex-2023.6.3-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bb60b503ec8a6e4e3e03a681072fa3a5adcbfa5479fa2d898ae2b4a8e24c4591"},
- {file = "regex-2023.6.3-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:65ba8603753cec91c71de423a943ba506363b0e5c3fdb913ef8f9caa14b2c7e0"},
- {file = "regex-2023.6.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:271f0bdba3c70b58e6f500b205d10a36fb4b58bd06ac61381b68de66442efddb"},
- {file = "regex-2023.6.3-cp37-cp37m-win32.whl", hash = "sha256:9beb322958aaca059f34975b0df135181f2e5d7a13b84d3e0e45434749cb20f7"},
- {file = "regex-2023.6.3-cp37-cp37m-win_amd64.whl", hash = "sha256:fea75c3710d4f31389eed3c02f62d0b66a9da282521075061ce875eb5300cf23"},
- {file = "regex-2023.6.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:8f56fcb7ff7bf7404becdfc60b1e81a6d0561807051fd2f1860b0d0348156a07"},
- {file = "regex-2023.6.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d2da3abc88711bce7557412310dfa50327d5769a31d1c894b58eb256459dc289"},
- {file = "regex-2023.6.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a99b50300df5add73d307cf66abea093304a07eb017bce94f01e795090dea87c"},
- {file = "regex-2023.6.3-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5708089ed5b40a7b2dc561e0c8baa9535b77771b64a8330b684823cfd5116036"},
- {file = "regex-2023.6.3-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:687ea9d78a4b1cf82f8479cab23678aff723108df3edeac098e5b2498879f4a7"},
- {file = "regex-2023.6.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4d3850beab9f527f06ccc94b446c864059c57651b3f911fddb8d9d3ec1d1b25d"},
- {file = "regex-2023.6.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e8915cc96abeb8983cea1df3c939e3c6e1ac778340c17732eb63bb96247b91d2"},
- {file = "regex-2023.6.3-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:841d6e0e5663d4c7b4c8099c9997be748677d46cbf43f9f471150e560791f7ff"},
- {file = "regex-2023.6.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:9edce5281f965cf135e19840f4d93d55b3835122aa76ccacfd389e880ba4cf82"},
- {file = "regex-2023.6.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:b956231ebdc45f5b7a2e1f90f66a12be9610ce775fe1b1d50414aac1e9206c06"},
- {file = "regex-2023.6.3-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:36efeba71c6539d23c4643be88295ce8c82c88bbd7c65e8a24081d2ca123da3f"},
- {file = "regex-2023.6.3-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:cf67ca618b4fd34aee78740bea954d7c69fdda419eb208c2c0c7060bb822d747"},
- {file = "regex-2023.6.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:b4598b1897837067a57b08147a68ac026c1e73b31ef6e36deeeb1fa60b2933c9"},
- {file = "regex-2023.6.3-cp38-cp38-win32.whl", hash = "sha256:f415f802fbcafed5dcc694c13b1292f07fe0befdb94aa8a52905bd115ff41e88"},
- {file = "regex-2023.6.3-cp38-cp38-win_amd64.whl", hash = "sha256:d4f03bb71d482f979bda92e1427f3ec9b220e62a7dd337af0aa6b47bf4498f72"},
- {file = "regex-2023.6.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:ccf91346b7bd20c790310c4147eee6ed495a54ddb6737162a36ce9dbef3e4751"},
- {file = "regex-2023.6.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:b28f5024a3a041009eb4c333863d7894d191215b39576535c6734cd88b0fcb68"},
- {file = "regex-2023.6.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e0bb18053dfcfed432cc3ac632b5e5e5c5b7e55fb3f8090e867bfd9b054dbcbf"},
- {file = "regex-2023.6.3-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9a5bfb3004f2144a084a16ce19ca56b8ac46e6fd0651f54269fc9e230edb5e4a"},
- {file = "regex-2023.6.3-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5c6b48d0fa50d8f4df3daf451be7f9689c2bde1a52b1225c5926e3f54b6a9ed1"},
- {file = "regex-2023.6.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:051da80e6eeb6e239e394ae60704d2b566aa6a7aed6f2890a7967307267a5dc6"},
- {file = "regex-2023.6.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a4c3b7fa4cdaa69268748665a1a6ff70c014d39bb69c50fda64b396c9116cf77"},
- {file = "regex-2023.6.3-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:457b6cce21bee41ac292d6753d5e94dcbc5c9e3e3a834da285b0bde7aa4a11e9"},
- {file = "regex-2023.6.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:aad51907d74fc183033ad796dd4c2e080d1adcc4fd3c0fd4fd499f30c03011cd"},
- {file = "regex-2023.6.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:0385e73da22363778ef2324950e08b689abdf0b108a7d8decb403ad7f5191938"},
- {file = "regex-2023.6.3-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:c6a57b742133830eec44d9b2290daf5cbe0a2f1d6acee1b3c7b1c7b2f3606df7"},
- {file = "regex-2023.6.3-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:3e5219bf9e75993d73ab3d25985c857c77e614525fac9ae02b1bebd92f7cecac"},
- {file = "regex-2023.6.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:e5087a3c59eef624a4591ef9eaa6e9a8d8a94c779dade95d27c0bc24650261cd"},
- {file = "regex-2023.6.3-cp39-cp39-win32.whl", hash = "sha256:20326216cc2afe69b6e98528160b225d72f85ab080cbdf0b11528cbbaba2248f"},
- {file = "regex-2023.6.3-cp39-cp39-win_amd64.whl", hash = "sha256:bdff5eab10e59cf26bc479f565e25ed71a7d041d1ded04ccf9aee1d9f208487a"},
- {file = "regex-2023.6.3.tar.gz", hash = "sha256:72d1a25bf36d2050ceb35b517afe13864865268dfb45910e2e17a84be6cbfeb0"},
+ {file = "regex-2023.8.8-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:88900f521c645f784260a8d346e12a1590f79e96403971241e64c3a265c8ecdb"},
+ {file = "regex-2023.8.8-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:3611576aff55918af2697410ff0293d6071b7e00f4b09e005d614686ac4cd57c"},
+ {file = "regex-2023.8.8-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b8a0ccc8f2698f120e9e5742f4b38dc944c38744d4bdfc427616f3a163dd9de5"},
+ {file = "regex-2023.8.8-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c662a4cbdd6280ee56f841f14620787215a171c4e2d1744c9528bed8f5816c96"},
+ {file = "regex-2023.8.8-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cf0633e4a1b667bfe0bb10b5e53fe0d5f34a6243ea2530eb342491f1adf4f739"},
+ {file = "regex-2023.8.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:551ad543fa19e94943c5b2cebc54c73353ffff08228ee5f3376bd27b3d5b9800"},
+ {file = "regex-2023.8.8-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:54de2619f5ea58474f2ac211ceea6b615af2d7e4306220d4f3fe690c91988a61"},
+ {file = "regex-2023.8.8-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:5ec4b3f0aebbbe2fc0134ee30a791af522a92ad9f164858805a77442d7d18570"},
+ {file = "regex-2023.8.8-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:3ae646c35cb9f820491760ac62c25b6d6b496757fda2d51be429e0e7b67ae0ab"},
+ {file = "regex-2023.8.8-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:ca339088839582d01654e6f83a637a4b8194d0960477b9769d2ff2cfa0fa36d2"},
+ {file = "regex-2023.8.8-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:d9b6627408021452dcd0d2cdf8da0534e19d93d070bfa8b6b4176f99711e7f90"},
+ {file = "regex-2023.8.8-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:bd3366aceedf274f765a3a4bc95d6cd97b130d1dda524d8f25225d14123c01db"},
+ {file = "regex-2023.8.8-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7aed90a72fc3654fba9bc4b7f851571dcc368120432ad68b226bd593f3f6c0b7"},
+ {file = "regex-2023.8.8-cp310-cp310-win32.whl", hash = "sha256:80b80b889cb767cc47f31d2b2f3dec2db8126fbcd0cff31b3925b4dc6609dcdb"},
+ {file = "regex-2023.8.8-cp310-cp310-win_amd64.whl", hash = "sha256:b82edc98d107cbc7357da7a5a695901b47d6eb0420e587256ba3ad24b80b7d0b"},
+ {file = "regex-2023.8.8-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1e7d84d64c84ad97bf06f3c8cb5e48941f135ace28f450d86af6b6512f1c9a71"},
+ {file = "regex-2023.8.8-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ce0f9fbe7d295f9922c0424a3637b88c6c472b75eafeaff6f910494a1fa719ef"},
+ {file = "regex-2023.8.8-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:06c57e14ac723b04458df5956cfb7e2d9caa6e9d353c0b4c7d5d54fcb1325c46"},
+ {file = "regex-2023.8.8-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e7a9aaa5a1267125eef22cef3b63484c3241aaec6f48949b366d26c7250e0357"},
+ {file = "regex-2023.8.8-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b7408511fca48a82a119d78a77c2f5eb1b22fe88b0d2450ed0756d194fe7a9a"},
+ {file = "regex-2023.8.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:14dc6f2d88192a67d708341f3085df6a4f5a0c7b03dec08d763ca2cd86e9f559"},
+ {file = "regex-2023.8.8-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:48c640b99213643d141550326f34f0502fedb1798adb3c9eb79650b1ecb2f177"},
+ {file = "regex-2023.8.8-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:0085da0f6c6393428bf0d9c08d8b1874d805bb55e17cb1dfa5ddb7cfb11140bf"},
+ {file = "regex-2023.8.8-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:964b16dcc10c79a4a2be9f1273fcc2684a9eedb3906439720598029a797b46e6"},
+ {file = "regex-2023.8.8-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:7ce606c14bb195b0e5108544b540e2c5faed6843367e4ab3deb5c6aa5e681208"},
+ {file = "regex-2023.8.8-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:40f029d73b10fac448c73d6eb33d57b34607f40116e9f6e9f0d32e9229b147d7"},
+ {file = "regex-2023.8.8-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:3b8e6ea6be6d64104d8e9afc34c151926f8182f84e7ac290a93925c0db004bfd"},
+ {file = "regex-2023.8.8-cp311-cp311-win32.whl", hash = "sha256:942f8b1f3b223638b02df7df79140646c03938d488fbfb771824f3d05fc083a8"},
+ {file = "regex-2023.8.8-cp311-cp311-win_amd64.whl", hash = "sha256:51d8ea2a3a1a8fe4f67de21b8b93757005213e8ac3917567872f2865185fa7fb"},
+ {file = "regex-2023.8.8-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:e951d1a8e9963ea51efd7f150450803e3b95db5939f994ad3d5edac2b6f6e2b4"},
+ {file = "regex-2023.8.8-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:704f63b774218207b8ccc6c47fcef5340741e5d839d11d606f70af93ee78e4d4"},
+ {file = "regex-2023.8.8-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:22283c769a7b01c8ac355d5be0715bf6929b6267619505e289f792b01304d898"},
+ {file = "regex-2023.8.8-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:91129ff1bb0619bc1f4ad19485718cc623a2dc433dff95baadbf89405c7f6b57"},
+ {file = "regex-2023.8.8-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:de35342190deb7b866ad6ba5cbcccb2d22c0487ee0cbb251efef0843d705f0d4"},
+ {file = "regex-2023.8.8-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b993b6f524d1e274a5062488a43e3f9f8764ee9745ccd8e8193df743dbe5ee61"},
+ {file = "regex-2023.8.8-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:3026cbcf11d79095a32d9a13bbc572a458727bd5b1ca332df4a79faecd45281c"},
+ {file = "regex-2023.8.8-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:293352710172239bf579c90a9864d0df57340b6fd21272345222fb6371bf82b3"},
+ {file = "regex-2023.8.8-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:d909b5a3fff619dc7e48b6b1bedc2f30ec43033ba7af32f936c10839e81b9217"},
+ {file = "regex-2023.8.8-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:3d370ff652323c5307d9c8e4c62efd1956fb08051b0e9210212bc51168b4ff56"},
+ {file = "regex-2023.8.8-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:b076da1ed19dc37788f6a934c60adf97bd02c7eea461b73730513921a85d4235"},
+ {file = "regex-2023.8.8-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:e9941a4ada58f6218694f382e43fdd256e97615db9da135e77359da257a7168b"},
+ {file = "regex-2023.8.8-cp36-cp36m-win32.whl", hash = "sha256:a8c65c17aed7e15a0c824cdc63a6b104dfc530f6fa8cb6ac51c437af52b481c7"},
+ {file = "regex-2023.8.8-cp36-cp36m-win_amd64.whl", hash = "sha256:aadf28046e77a72f30dcc1ab185639e8de7f4104b8cb5c6dfa5d8ed860e57236"},
+ {file = "regex-2023.8.8-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:423adfa872b4908843ac3e7a30f957f5d5282944b81ca0a3b8a7ccbbfaa06103"},
+ {file = "regex-2023.8.8-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ae594c66f4a7e1ea67232a0846649a7c94c188d6c071ac0210c3e86a5f92109"},
+ {file = "regex-2023.8.8-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e51c80c168074faa793685656c38eb7a06cbad7774c8cbc3ea05552d615393d8"},
+ {file = "regex-2023.8.8-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:09b7f4c66aa9d1522b06e31a54f15581c37286237208df1345108fcf4e050c18"},
+ {file = "regex-2023.8.8-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e73e5243af12d9cd6a9d6a45a43570dbe2e5b1cdfc862f5ae2b031e44dd95a8"},
+ {file = "regex-2023.8.8-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:941460db8fe3bd613db52f05259c9336f5a47ccae7d7def44cc277184030a116"},
+ {file = "regex-2023.8.8-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f0ccf3e01afeb412a1a9993049cb160d0352dba635bbca7762b2dc722aa5742a"},
+ {file = "regex-2023.8.8-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:2e9216e0d2cdce7dbc9be48cb3eacb962740a09b011a116fd7af8c832ab116ca"},
+ {file = "regex-2023.8.8-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:5cd9cd7170459b9223c5e592ac036e0704bee765706445c353d96f2890e816c8"},
+ {file = "regex-2023.8.8-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:4873ef92e03a4309b3ccd8281454801b291b689f6ad45ef8c3658b6fa761d7ac"},
+ {file = "regex-2023.8.8-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:239c3c2a339d3b3ddd51c2daef10874410917cd2b998f043c13e2084cb191684"},
+ {file = "regex-2023.8.8-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:1005c60ed7037be0d9dea1f9c53cc42f836188227366370867222bda4c3c6bd7"},
+ {file = "regex-2023.8.8-cp37-cp37m-win32.whl", hash = "sha256:e6bd1e9b95bc5614a7a9c9c44fde9539cba1c823b43a9f7bc11266446dd568e3"},
+ {file = "regex-2023.8.8-cp37-cp37m-win_amd64.whl", hash = "sha256:9a96edd79661e93327cfeac4edec72a4046e14550a1d22aa0dd2e3ca52aec921"},
+ {file = "regex-2023.8.8-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f2181c20ef18747d5f4a7ea513e09ea03bdd50884a11ce46066bb90fe4213675"},
+ {file = "regex-2023.8.8-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a2ad5add903eb7cdde2b7c64aaca405f3957ab34f16594d2b78d53b8b1a6a7d6"},
+ {file = "regex-2023.8.8-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9233ac249b354c54146e392e8a451e465dd2d967fc773690811d3a8c240ac601"},
+ {file = "regex-2023.8.8-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:920974009fb37b20d32afcdf0227a2e707eb83fe418713f7a8b7de038b870d0b"},
+ {file = "regex-2023.8.8-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd2b6c5dfe0929b6c23dde9624483380b170b6e34ed79054ad131b20203a1a63"},
+ {file = "regex-2023.8.8-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:96979d753b1dc3b2169003e1854dc67bfc86edf93c01e84757927f810b8c3c93"},
+ {file = "regex-2023.8.8-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2ae54a338191e1356253e7883d9d19f8679b6143703086245fb14d1f20196be9"},
+ {file = "regex-2023.8.8-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2162ae2eb8b079622176a81b65d486ba50b888271302190870b8cc488587d280"},
+ {file = "regex-2023.8.8-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:c884d1a59e69e03b93cf0dfee8794c63d7de0ee8f7ffb76e5f75be8131b6400a"},
+ {file = "regex-2023.8.8-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:cf9273e96f3ee2ac89ffcb17627a78f78e7516b08f94dc435844ae72576a276e"},
+ {file = "regex-2023.8.8-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:83215147121e15d5f3a45d99abeed9cf1fe16869d5c233b08c56cdf75f43a504"},
+ {file = "regex-2023.8.8-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:3f7454aa427b8ab9101f3787eb178057c5250478e39b99540cfc2b889c7d0586"},
+ {file = "regex-2023.8.8-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:f0640913d2c1044d97e30d7c41728195fc37e54d190c5385eacb52115127b882"},
+ {file = "regex-2023.8.8-cp38-cp38-win32.whl", hash = "sha256:0c59122ceccb905a941fb23b087b8eafc5290bf983ebcb14d2301febcbe199c7"},
+ {file = "regex-2023.8.8-cp38-cp38-win_amd64.whl", hash = "sha256:c12f6f67495ea05c3d542d119d270007090bad5b843f642d418eb601ec0fa7be"},
+ {file = "regex-2023.8.8-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:82cd0a69cd28f6cc3789cc6adeb1027f79526b1ab50b1f6062bbc3a0ccb2dbc3"},
+ {file = "regex-2023.8.8-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:bb34d1605f96a245fc39790a117ac1bac8de84ab7691637b26ab2c5efb8f228c"},
+ {file = "regex-2023.8.8-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:987b9ac04d0b38ef4f89fbc035e84a7efad9cdd5f1e29024f9289182c8d99e09"},
+ {file = "regex-2023.8.8-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9dd6082f4e2aec9b6a0927202c85bc1b09dcab113f97265127c1dc20e2e32495"},
+ {file = "regex-2023.8.8-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7eb95fe8222932c10d4436e7a6f7c99991e3fdd9f36c949eff16a69246dee2dc"},
+ {file = "regex-2023.8.8-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7098c524ba9f20717a56a8d551d2ed491ea89cbf37e540759ed3b776a4f8d6eb"},
+ {file = "regex-2023.8.8-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4b694430b3f00eb02c594ff5a16db30e054c1b9589a043fe9174584c6efa8033"},
+ {file = "regex-2023.8.8-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:b2aeab3895d778155054abea5238d0eb9a72e9242bd4b43f42fd911ef9a13470"},
+ {file = "regex-2023.8.8-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:988631b9d78b546e284478c2ec15c8a85960e262e247b35ca5eaf7ee22f6050a"},
+ {file = "regex-2023.8.8-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:67ecd894e56a0c6108ec5ab1d8fa8418ec0cff45844a855966b875d1039a2e34"},
+ {file = "regex-2023.8.8-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:14898830f0a0eb67cae2bbbc787c1a7d6e34ecc06fbd39d3af5fe29a4468e2c9"},
+ {file = "regex-2023.8.8-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:f2200e00b62568cfd920127782c61bc1c546062a879cdc741cfcc6976668dfcf"},
+ {file = "regex-2023.8.8-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:9691a549c19c22d26a4f3b948071e93517bdf86e41b81d8c6ac8a964bb71e5a6"},
+ {file = "regex-2023.8.8-cp39-cp39-win32.whl", hash = "sha256:6ab2ed84bf0137927846b37e882745a827458689eb969028af8032b1b3dac78e"},
+ {file = "regex-2023.8.8-cp39-cp39-win_amd64.whl", hash = "sha256:5543c055d8ec7801901e1193a51570643d6a6ab8751b1f7dd9af71af467538bb"},
+ {file = "regex-2023.8.8.tar.gz", hash = "sha256:fcbdc5f2b0f1cd0f6a56cdb46fe41d2cce1e644e3b68832f3eeebc5fb0f7712e"},
]
[[package]]
name = "requests"
version = "2.31.0"
description = "Python HTTP for Humans."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -5824,7 +5740,6 @@ use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
name = "requests-toolbelt"
version = "1.0.0"
description = "A utility belt for advanced users of python-requests"
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
@@ -5839,7 +5754,6 @@ requests = ">=2.0.1,<3.0.0"
name = "rfc3986"
version = "1.5.0"
description = "Validating URI References per RFC 3986"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -5855,14 +5769,13 @@ idna2008 = ["idna"]
[[package]]
name = "rich"
-version = "13.4.2"
+version = "13.5.2"
description = "Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal"
-category = "main"
optional = false
python-versions = ">=3.7.0"
files = [
- {file = "rich-13.4.2-py3-none-any.whl", hash = "sha256:8f87bc7ee54675732fa66a05ebfe489e27264caeeff3728c945d25971b6485ec"},
- {file = "rich-13.4.2.tar.gz", hash = "sha256:d653d6bccede5844304c605d5aac802c7cf9621efd700b46c7ec2b51ea914898"},
+ {file = "rich-13.5.2-py3-none-any.whl", hash = "sha256:146a90b3b6b47cac4a73c12866a499e9817426423f57c5a66949c086191a8808"},
+ {file = "rich-13.5.2.tar.gz", hash = "sha256:fb9d6c0a0f643c99eed3875b5377a184132ba9be4d61516a55273d3554d75a39"},
]
[package.dependencies]
@@ -5876,7 +5789,6 @@ jupyter = ["ipywidgets (>=7.5.1,<9)"]
name = "rsa"
version = "4.9"
description = "Pure-Python RSA implementation"
-category = "main"
optional = false
python-versions = ">=3.6,<4"
files = [
@@ -5891,7 +5803,6 @@ pyasn1 = ">=0.1.3"
name = "ruff"
version = "0.0.254"
description = "An extremely fast Python linter, written in Rust."
-category = "dev"
optional = false
python-versions = ">=3.7"
files = [
@@ -5916,60 +5827,49 @@ files = [
[[package]]
name = "safetensors"
-version = "0.3.1"
+version = "0.3.2"
description = "Fast and Safe Tensor serialization"
-category = "main"
-optional = false
+optional = true
python-versions = "*"
files = [
- {file = "safetensors-0.3.1-cp310-cp310-macosx_10_11_x86_64.whl", hash = "sha256:2ae9b7dd268b4bae6624729dac86deb82104820e9786429b0583e5168db2f770"},
- {file = "safetensors-0.3.1-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:08c85c1934682f1e2cd904d38433b53cd2a98245a7cc31f5689f9322a2320bbf"},
- {file = "safetensors-0.3.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ba625c7af9e1c5d0d91cb83d2fba97d29ea69d4db2015d9714d24c7f6d488e15"},
- {file = "safetensors-0.3.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b57d5890c619ec10d9f1b6426b8690d0c9c2868a90dc52f13fae6f6407ac141f"},
- {file = "safetensors-0.3.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5c9f562ea696d50b95cadbeb1716dc476714a87792ffe374280c0835312cbfe2"},
- {file = "safetensors-0.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5c115951b3a865ece8d98ee43882f2fd0a999c0200d6e6fec24134715ebe3b57"},
- {file = "safetensors-0.3.1-cp310-cp310-win32.whl", hash = "sha256:118f8f7503ea312fc7af27e934088a1b589fb1eff5a7dea2cd1de6c71ee33391"},
- {file = "safetensors-0.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:54846eaae25fded28a7bebbb66be563cad221b4c80daee39e2f55df5e5e0266f"},
- {file = "safetensors-0.3.1-cp311-cp311-macosx_10_11_universal2.whl", hash = "sha256:5af82e10946c4822506db0f29269f43147e889054704dde994d4e22f0c37377b"},
- {file = "safetensors-0.3.1-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:626c86dd1d930963c8ea7f953a3787ae85322551e3a5203ac731d6e6f3e18f44"},
- {file = "safetensors-0.3.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:12e30677e6af1f4cc4f2832546e91dbb3b0aa7d575bfa473d2899d524e1ace08"},
- {file = "safetensors-0.3.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d534b80bc8d39945bb902f34b0454773971fe9e5e1f2142af451759d7e52b356"},
- {file = "safetensors-0.3.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ddd0ddd502cf219666e7d30f23f196cb87e829439b52b39f3e7da7918c3416df"},
- {file = "safetensors-0.3.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:997a2cc14023713f423e6d16536d55cb16a3d72850f142e05f82f0d4c76d383b"},
- {file = "safetensors-0.3.1-cp311-cp311-win32.whl", hash = "sha256:6ae9ca63d9e22f71ec40550207bd284a60a6b4916ae6ca12c85a8d86bf49e0c3"},
- {file = "safetensors-0.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:62aa7421ca455418423e35029524489480adda53e3f702453580180ecfebe476"},
- {file = "safetensors-0.3.1-cp37-cp37m-macosx_10_11_x86_64.whl", hash = "sha256:6d54b3ed367b6898baab75dfd057c24f36ec64d3938ffff2af981d56bfba2f42"},
- {file = "safetensors-0.3.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:262423aeda91117010f8c607889066028f680fbb667f50cfe6eae96f22f9d150"},
- {file = "safetensors-0.3.1-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:10efe2513a8327fd628cea13167089588acc23093ba132aecfc536eb9a4560fe"},
- {file = "safetensors-0.3.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:689b3d6a7ebce70ee9438267ee55ea89b575c19923876645e927d08757b552fe"},
- {file = "safetensors-0.3.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:14cd9a87bc73ce06903e9f8ee8b05b056af6f3c9f37a6bd74997a16ed36ff5f4"},
- {file = "safetensors-0.3.1-cp37-cp37m-win32.whl", hash = "sha256:a77cb39624480d5f143c1cc272184f65a296f573d61629eff5d495d2e0541d3e"},
- {file = "safetensors-0.3.1-cp37-cp37m-win_amd64.whl", hash = "sha256:9eff3190bfbbb52eef729911345c643f875ca4dbb374aa6c559675cfd0ab73db"},
- {file = "safetensors-0.3.1-cp38-cp38-macosx_10_11_x86_64.whl", hash = "sha256:05cbfef76e4daa14796db1bbb52072d4b72a44050c368b2b1f6fd3e610669a89"},
- {file = "safetensors-0.3.1-cp38-cp38-macosx_12_0_arm64.whl", hash = "sha256:c49061461f4a81e5ec3415070a3f135530834c89cbd6a7db7cd49e3cb9d9864b"},
- {file = "safetensors-0.3.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22cf7e73ca42974f098ce0cf4dd8918983700b6b07a4c6827d50c8daefca776e"},
- {file = "safetensors-0.3.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:04f909442d6223ff0016cd2e1b2a95ef8039b92a558014627363a2e267213f62"},
- {file = "safetensors-0.3.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2c573c5a0d5d45791ae8c179e26d74aff86e719056591aa7edb3ca7be55bc961"},
- {file = "safetensors-0.3.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6994043b12e717cf2a6ba69077ac41f0d3675b2819734f07f61819e854c622c7"},
- {file = "safetensors-0.3.1-cp38-cp38-win32.whl", hash = "sha256:158ede81694180a0dbba59422bc304a78c054b305df993c0c6e39c6330fa9348"},
- {file = "safetensors-0.3.1-cp38-cp38-win_amd64.whl", hash = "sha256:afdc725beff7121ea8d39a7339f5a6abcb01daa189ea56290b67fe262d56e20f"},
- {file = "safetensors-0.3.1-cp39-cp39-macosx_10_11_x86_64.whl", hash = "sha256:cba910fcc9e5e64d32d62b837388721165e9c7e45d23bc3a38ad57694b77f40d"},
- {file = "safetensors-0.3.1-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:a4f7dbfe7285573cdaddd85ef6fa84ebbed995d3703ab72d71257944e384612f"},
- {file = "safetensors-0.3.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:54aed0802f9eaa83ca7b1cbb986bfb90b8e2c67b6a4bcfe245627e17dad565d4"},
- {file = "safetensors-0.3.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:34b75a766f3cfc99fd4c33e329b76deae63f5f388e455d863a5d6e99472fca8e"},
- {file = "safetensors-0.3.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1a0f31904f35dc14919a145b2d7a2d8842a43a18a629affe678233c4ea90b4af"},
- {file = "safetensors-0.3.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dcf527ecc5f58907fd9031510378105487f318cc91ecdc5aee3c7cc8f46030a8"},
- {file = "safetensors-0.3.1-cp39-cp39-win32.whl", hash = "sha256:e2f083112cf97aa9611e2a05cc170a2795eccec5f6ff837f4565f950670a9d83"},
- {file = "safetensors-0.3.1-cp39-cp39-win_amd64.whl", hash = "sha256:5f4f614b8e8161cd8a9ca19c765d176a82b122fa3d3387b77862145bfe9b4e93"},
- {file = "safetensors-0.3.1.tar.gz", hash = "sha256:571da56ff8d0bec8ae54923b621cda98d36dcef10feb36fd492c4d0c2cd0e869"},
+ {file = "safetensors-0.3.2-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:b6a66989075c2891d743153e8ba9ca84ee7232c8539704488f454199b8b8f84d"},
+ {file = "safetensors-0.3.2-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:670d6bc3a3b377278ce2971fa7c36ebc0a35041c4ea23b9df750a39380800195"},
+ {file = "safetensors-0.3.2-cp310-cp310-macosx_13_0_x86_64.whl", hash = "sha256:7f80af7e4ab3188daaff12d43d078da3017a90d732d38d7af4eb08b6ca2198a5"},
+ {file = "safetensors-0.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cbb44e140bf2aeda98d9dde669dbec15f7b77f96a9274469b91a6cf4bcc5ec3b"},
+ {file = "safetensors-0.3.2-cp310-cp310-win32.whl", hash = "sha256:2961c1243fd0da46aa6a1c835305cc4595486f8ac64632a604d0eb5f2de76175"},
+ {file = "safetensors-0.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:c813920482c337d1424d306e1b05824a38e3ef94303748a0a287dea7a8c4f805"},
+ {file = "safetensors-0.3.2-cp311-cp311-macosx_10_11_universal2.whl", hash = "sha256:707df34bd9b9047e97332136ad98e57028faeccdb9cfe1c3b52aba5964cc24bf"},
+ {file = "safetensors-0.3.2-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:becc5bb85b2947eae20ed23b407ebfd5277d9a560f90381fe2c42e6c043677ba"},
+ {file = "safetensors-0.3.2-cp311-cp311-macosx_13_0_universal2.whl", hash = "sha256:54ad6af663e15e2b99e2ea3280981b7514485df72ba6d014dc22dae7ba6a5e6c"},
+ {file = "safetensors-0.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ada0fac127ff8fb04834da5c6d85a8077e6a1c9180a11251d96f8068db922a17"},
+ {file = "safetensors-0.3.2-cp311-cp311-win32.whl", hash = "sha256:155b82dbe2b0ebff18cde3f76b42b6d9470296e92561ef1a282004d449fa2b4c"},
+ {file = "safetensors-0.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:a86428d196959619ce90197731be9391b5098b35100a7228ef4643957648f7f5"},
+ {file = "safetensors-0.3.2-cp37-cp37m-macosx_11_0_x86_64.whl", hash = "sha256:c1f8ab41ed735c5b581f451fd15d9602ff51aa88044bfa933c5fa4b1d0c644d1"},
+ {file = "safetensors-0.3.2-cp37-cp37m-macosx_13_0_x86_64.whl", hash = "sha256:bc9cfb3c9ea2aec89685b4d656f9f2296f0f0d67ecf2bebf950870e3be89b3db"},
+ {file = "safetensors-0.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d7d70d48585fe8df00725aa788f2e64fd24a4c9ae07cd6be34f6859d0f89a9c"},
+ {file = "safetensors-0.3.2-cp37-cp37m-win32.whl", hash = "sha256:6ff59bc90cdc857f68b1023be9085fda6202bbe7f2fd67d06af8f976d6adcc10"},
+ {file = "safetensors-0.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:8b05c93da15fa911763a89281906ca333ed800ab0ef1c7ce53317aa1a2322f19"},
+ {file = "safetensors-0.3.2-cp38-cp38-macosx_11_0_x86_64.whl", hash = "sha256:8969cfd9e8d904e8d3c67c989e1bd9a95e3cc8980d4f95e4dcd43c299bb94253"},
+ {file = "safetensors-0.3.2-cp38-cp38-macosx_13_0_x86_64.whl", hash = "sha256:f54148ac027556eb02187e9bc1556c4d916c99ca3cb34ca36a7d304d675035c1"},
+ {file = "safetensors-0.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa98f49e95f02eb750d32c4947e7d5aa43883149ebd0414920866446525b70f0"},
+ {file = "safetensors-0.3.2-cp38-cp38-win32.whl", hash = "sha256:33409df5e28a83dc5cc5547a3ac17c0f1b13a1847b1eb3bc4b3be0df9915171e"},
+ {file = "safetensors-0.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:e04a7cbbb3856159ab99e3adb14521544f65fcb8548cce773a1435a0f8d78d27"},
+ {file = "safetensors-0.3.2-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:7c864cf5dcbfb608c5378f83319c60cc9c97263343b57c02756b7613cd5ab4dd"},
+ {file = "safetensors-0.3.2-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:14e8c19d6dc51d4f70ee33c46aff04c8ba3f95812e74daf8036c24bc86e75cae"},
+ {file = "safetensors-0.3.2-cp39-cp39-macosx_13_0_x86_64.whl", hash = "sha256:fafd95e5ef41e8f312e2a32b7031f7b9b2a621b255f867b221f94bb2e9f51ae8"},
+ {file = "safetensors-0.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87ff0024ef2e5722a79af24688ce4a430f70601d0cf712a744105ed4b8f67ba5"},
+ {file = "safetensors-0.3.2-cp39-cp39-win32.whl", hash = "sha256:827af9478b78977248ba93e2fd97ea307fb63f463f80cef4824460f8c2542a52"},
+ {file = "safetensors-0.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:9b09f27c456efa301f98681ea14b12f81f2637889f6336223ccab71e42c34541"},
+ {file = "safetensors-0.3.2.tar.gz", hash = "sha256:2dbd34554ed3b99435a0e84df077108f5334c8336b5ed9cb8b6b98f7b10da2f6"},
]
[package.extras]
-all = ["black (==22.3)", "click (==8.0.4)", "flake8 (>=3.8.3)", "flax (>=0.6.3)", "h5py (>=3.7.0)", "huggingface-hub (>=0.12.1)", "isort (>=5.5.4)", "jax (>=0.3.25)", "jaxlib (>=0.3.25)", "numpy (>=1.21.6)", "paddlepaddle (>=2.4.1)", "pytest (>=7.2.0)", "pytest-benchmark (>=4.0.0)", "setuptools-rust (>=1.5.2)", "tensorflow (>=2.11.0)", "torch (>=1.10)"]
-dev = ["black (==22.3)", "click (==8.0.4)", "flake8 (>=3.8.3)", "flax (>=0.6.3)", "h5py (>=3.7.0)", "huggingface-hub (>=0.12.1)", "isort (>=5.5.4)", "jax (>=0.3.25)", "jaxlib (>=0.3.25)", "numpy (>=1.21.6)", "paddlepaddle (>=2.4.1)", "pytest (>=7.2.0)", "pytest-benchmark (>=4.0.0)", "setuptools-rust (>=1.5.2)", "tensorflow (>=2.11.0)", "torch (>=1.10)"]
+all = ["black (==22.3)", "click (==8.0.4)", "flake8 (>=3.8.3)", "flax (>=0.6.3)", "h5py (>=3.7.0)", "huggingface-hub (>=0.12.1)", "isort (>=5.5.4)", "jax (>=0.3.25)", "jaxlib (>=0.3.25)", "numpy (>=1.21.6)", "paddlepaddle (>=2.4.1)", "pytest (>=7.2.0)", "pytest-benchmark (>=4.0.0)", "setuptools-rust (>=1.5.2)", "tensorflow (==2.11.0)", "torch (>=1.10)"]
+dev = ["black (==22.3)", "click (==8.0.4)", "flake8 (>=3.8.3)", "flax (>=0.6.3)", "h5py (>=3.7.0)", "huggingface-hub (>=0.12.1)", "isort (>=5.5.4)", "jax (>=0.3.25)", "jaxlib (>=0.3.25)", "numpy (>=1.21.6)", "paddlepaddle (>=2.4.1)", "pytest (>=7.2.0)", "pytest-benchmark (>=4.0.0)", "setuptools-rust (>=1.5.2)", "tensorflow (==2.11.0)", "torch (>=1.10)"]
jax = ["flax (>=0.6.3)", "jax (>=0.3.25)", "jaxlib (>=0.3.25)"]
numpy = ["numpy (>=1.21.6)"]
paddlepaddle = ["paddlepaddle (>=2.4.1)"]
+pinned-tf = ["tensorflow (==2.11.0)"]
quality = ["black (==22.3)", "click (==8.0.4)", "flake8 (>=3.8.3)", "isort (>=5.5.4)"]
tensorflow = ["tensorflow (>=2.11.0)"]
testing = ["h5py (>=3.7.0)", "huggingface-hub (>=0.12.1)", "numpy (>=1.21.6)", "pytest (>=7.2.0)", "pytest-benchmark (>=4.0.0)", "setuptools-rust (>=1.5.2)"]
@@ -5979,8 +5879,7 @@ torch = ["torch (>=1.10)"]
name = "scikit-learn"
version = "1.3.0"
description = "A set of python modules for machine learning and data mining"
-category = "main"
-optional = false
+optional = true
python-versions = ">=3.8"
files = [
{file = "scikit-learn-1.3.0.tar.gz", hash = "sha256:8be549886f5eda46436b6e555b0e4873b4f10aa21c07df45c4bc1735afbccd7a"},
@@ -6022,8 +5921,7 @@ tests = ["black (>=23.3.0)", "matplotlib (>=3.1.3)", "mypy (>=1.3)", "numpydoc (
name = "scipy"
version = "1.11.1"
description = "Fundamental algorithms for scientific computing in Python"
-category = "main"
-optional = false
+optional = true
python-versions = "<3.13,>=3.9"
files = [
{file = "scipy-1.11.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:aec8c62fbe52914f9cf28d846cf0401dd80ab80788bbab909434eb336ed07c04"},
@@ -6059,7 +5957,6 @@ test = ["asv", "gmpy2", "mpmath", "pooch", "pytest", "pytest-cov", "pytest-timeo
name = "secretstorage"
version = "3.3.3"
description = "Python bindings to FreeDesktop.org Secret Service API"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -6075,7 +5972,6 @@ jeepney = ">=0.6"
name = "semver"
version = "2.13.0"
description = "Python helper for Semantic Versioning (http://semver.org/)"
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
@@ -6087,8 +5983,7 @@ files = [
name = "sentence-transformers"
version = "2.2.2"
description = "Multilingual text embeddings"
-category = "main"
-optional = false
+optional = true
python-versions = ">=3.6.0"
files = [
{file = "sentence-transformers-2.2.2.tar.gz", hash = "sha256:dbc60163b27de21076c9a30d24b5b7b6fa05141d68cf2553fa9a77bf79a29136"},
@@ -6110,8 +6005,7 @@ transformers = ">=4.6.0,<5.0.0"
name = "sentencepiece"
version = "0.1.99"
description = "SentencePiece python wrapper"
-category = "main"
-optional = false
+optional = true
python-versions = "*"
files = [
{file = "sentencepiece-0.1.99-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0eb528e70571b7c02723e5804322469b82fe7ea418c96051d0286c0fa028db73"},
@@ -6165,7 +6059,6 @@ files = [
name = "setuptools"
version = "68.0.0"
description = "Easily download, build, install, upgrade, and uninstall Python packages"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -6182,7 +6075,6 @@ testing-integration = ["build[virtualenv]", "filelock (>=3.4.0)", "jaraco.envs (
name = "shapely"
version = "1.8.5.post1"
description = "Geometric objects, predicates, and operations"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -6239,7 +6131,6 @@ vectorized = ["numpy"]
name = "six"
version = "1.16.0"
description = "Python 2 and 3 compatibility utilities"
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
files = [
@@ -6251,7 +6142,6 @@ files = [
name = "slack-bolt"
version = "1.18.0"
description = "The Bolt Framework for Python"
-category = "main"
optional = true
python-versions = ">=3.6"
files = [
@@ -6273,7 +6163,6 @@ testing-without-asyncio = ["Flask-Sockets (>=0.2,<1)", "Jinja2 (==3.0.3)", "Werk
name = "slack-sdk"
version = "3.21.3"
description = "The Slack API Platform SDK for Python"
-category = "main"
optional = true
python-versions = ">=3.6.0"
files = [
@@ -6289,7 +6178,6 @@ testing = ["Flask (>=1,<2)", "Flask-Sockets (>=0.2,<1)", "Jinja2 (==3.0.3)", "We
name = "smmap"
version = "5.0.0"
description = "A pure Python implementation of a sliding window memory map manager"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -6301,7 +6189,6 @@ files = [
name = "sniffio"
version = "1.3.0"
description = "Sniff out which async library your code is running under"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -6313,7 +6200,6 @@ files = [
name = "soupsieve"
version = "2.4.1"
description = "A modern CSS selector implementation for Beautiful Soup."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -6325,7 +6211,6 @@ files = [
name = "sqlalchemy"
version = "1.4.41"
description = "Database Abstraction Library"
-category = "main"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
files = [
@@ -6373,7 +6258,7 @@ files = [
]
[package.dependencies]
-greenlet = {version = "!=0.4.17", markers = "python_version >= \"3\" and platform_machine == \"aarch64\" or python_version >= \"3\" and platform_machine == \"ppc64le\" or python_version >= \"3\" and platform_machine == \"x86_64\" or python_version >= \"3\" and platform_machine == \"amd64\" or python_version >= \"3\" and platform_machine == \"AMD64\" or python_version >= \"3\" and platform_machine == \"win32\" or python_version >= \"3\" and platform_machine == \"WIN32\""}
+greenlet = {version = "!=0.4.17", markers = "python_version >= \"3\" and (platform_machine == \"win32\" or platform_machine == \"WIN32\" or platform_machine == \"AMD64\" or platform_machine == \"amd64\" or platform_machine == \"x86_64\" or platform_machine == \"ppc64le\" or platform_machine == \"aarch64\")"}
[package.extras]
aiomysql = ["aiomysql", "greenlet (!=0.4.17)"]
@@ -6400,7 +6285,6 @@ sqlcipher = ["sqlcipher3-binary"]
name = "sqlalchemy2-stubs"
version = "0.0.2a35"
description = "Typing Stubs for SQLAlchemy 1.4"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -6415,7 +6299,6 @@ typing-extensions = ">=3.7.4"
name = "sqlmodel"
version = "0.0.8"
description = "SQLModel, SQL databases in Python, designed for simplicity, compatibility, and robustness."
-category = "main"
optional = false
python-versions = ">=3.6.1,<4.0.0"
files = [
@@ -6432,7 +6315,6 @@ sqlalchemy2-stubs = "*"
name = "stack-data"
version = "0.6.2"
description = "Extract data from python stack frames and tracebacks for informative displays"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -6452,7 +6334,6 @@ tests = ["cython", "littleutils", "pygments", "pytest", "typeguard"]
name = "starlette"
version = "0.27.0"
description = "The little ASGI library that shines."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -6469,18 +6350,17 @@ full = ["httpx (>=0.22.0)", "itsdangerous", "jinja2", "python-multipart", "pyyam
[[package]]
name = "storage3"
-version = "0.5.2"
+version = "0.5.3"
description = "Supabase Storage client for Python."
-category = "main"
optional = false
python-versions = ">=3.8,<4.0"
files = [
- {file = "storage3-0.5.2-py3-none-any.whl", hash = "sha256:3aaba8cebf89eef6b5fc48739b8c8c8539461f2eed9ea1dc4c763dea10c6d009"},
- {file = "storage3-0.5.2.tar.gz", hash = "sha256:e9932fca869a8f9cdab9a20e5249439928cfe2d07c4524141b15fef1882a7f61"},
+ {file = "storage3-0.5.3-py3-none-any.whl", hash = "sha256:5dab88b8e91afadb72fbfde4ce8fb819d6324385624ceb9dca2927fb80b3b800"},
+ {file = "storage3-0.5.3.tar.gz", hash = "sha256:0c8b356d61eb021d8fcb9ca94d124754f2738c75a73babef91b2f1f60b2a13c0"},
]
[package.dependencies]
-httpx = ">=0.23,<0.24"
+httpx = ">=0.23,<0.25"
python-dateutil = ">=2.8.2,<3.0.0"
typing-extensions = ">=4.2.0,<5.0.0"
@@ -6488,7 +6368,6 @@ typing-extensions = ">=4.2.0,<5.0.0"
name = "strenum"
version = "0.4.15"
description = "An Enum that inherits from str."
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -6505,7 +6384,6 @@ test = ["pylint", "pytest", "pytest-black", "pytest-cov", "pytest-pylint"]
name = "supabase"
version = "1.0.3"
description = "Supabase client for Python."
-category = "main"
optional = false
python-versions = ">=3.8,<4.0"
files = [
@@ -6526,7 +6404,6 @@ supafunc = ">=0.2.2,<0.3.0"
name = "supafunc"
version = "0.2.2"
description = "Library for Supabase Functions"
-category = "main"
optional = false
python-versions = ">=3.7,<4.0"
files = [
@@ -6541,7 +6418,6 @@ httpx = ">=0.23.0,<0.24.0"
name = "sympy"
version = "1.12"
description = "Computer algebra system (CAS) in Python"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -6556,7 +6432,6 @@ mpmath = ">=0.19"
name = "tabulate"
version = "0.9.0"
description = "Pretty-print tabular data"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -6571,7 +6446,6 @@ widechars = ["wcwidth"]
name = "tenacity"
version = "8.2.2"
description = "Retry code until it succeeds"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -6584,14 +6458,13 @@ doc = ["reno", "sphinx", "tornado (>=4.5)"]
[[package]]
name = "textual"
-version = "0.30.0"
+version = "0.32.0"
description = "Modern Text User Interface framework"
-category = "main"
optional = true
python-versions = ">=3.7,<4.0"
files = [
- {file = "textual-0.30.0-py3-none-any.whl", hash = "sha256:e87d587e4569236f3809d41955ed9556287dbedaca64724e1d6ad5adbb69c9c5"},
- {file = "textual-0.30.0.tar.gz", hash = "sha256:bf7045a7e9b7dc3ac589c38ce86ac31aecf0e76e8c8ce09aee474316bc2e2c03"},
+ {file = "textual-0.32.0-py3-none-any.whl", hash = "sha256:81fc68406c8806bc864e2f035874a868b4ff0cf466289dce5f7b31869949383b"},
+ {file = "textual-0.32.0.tar.gz", hash = "sha256:f7b6683bc18faee6fd3c47cfbad43fbf8273c5fecc12230d52ce5ee089021327"},
]
[package.dependencies]
@@ -6604,8 +6477,7 @@ typing-extensions = ">=4.4.0,<5.0.0"
name = "threadpoolctl"
version = "3.2.0"
description = "threadpoolctl"
-category = "main"
-optional = false
+optional = true
python-versions = ">=3.8"
files = [
{file = "threadpoolctl-3.2.0-py3-none-any.whl", hash = "sha256:2b7818516e423bdaebb97c723f86a7c6b0a83d3f3b0970328d66f4d9104dc032"},
@@ -6616,7 +6488,6 @@ files = [
name = "tiktoken"
version = "0.4.0"
description = "tiktoken is a fast BPE tokeniser for use with OpenAI's models"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -6662,7 +6533,6 @@ blobfile = ["blobfile (>=2)"]
name = "tokenizers"
version = "0.13.3"
description = "Fast and Customizable Tokenizers"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -6717,7 +6587,6 @@ testing = ["black (==22.3)", "datasets", "numpy", "pytest", "requests"]
name = "toml"
version = "0.10.2"
description = "Python Library for Tom's Obvious, Minimal Language"
-category = "main"
optional = true
python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
files = [
@@ -6729,7 +6598,6 @@ files = [
name = "tomli"
version = "2.0.1"
description = "A lil' TOML parser"
-category = "dev"
optional = false
python-versions = ">=3.7"
files = [
@@ -6739,22 +6607,20 @@ files = [
[[package]]
name = "tomlkit"
-version = "0.11.8"
+version = "0.12.1"
description = "Style preserving TOML library"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "tomlkit-0.11.8-py3-none-any.whl", hash = "sha256:8c726c4c202bdb148667835f68d68780b9a003a9ec34167b6c673b38eff2a171"},
- {file = "tomlkit-0.11.8.tar.gz", hash = "sha256:9330fc7faa1db67b541b28e62018c17d20be733177d290a13b24c62d1614e0c3"},
+ {file = "tomlkit-0.12.1-py3-none-any.whl", hash = "sha256:712cbd236609acc6a3e2e97253dfc52d4c2082982a88f61b640ecf0817eab899"},
+ {file = "tomlkit-0.12.1.tar.gz", hash = "sha256:38e1ff8edb991273ec9f6181244a6a391ac30e9f5098e7535640ea6be97a7c86"},
]
[[package]]
name = "torch"
version = "2.0.1"
description = "Tensors and Dynamic neural networks in Python with strong GPU acceleration"
-category = "main"
-optional = false
+optional = true
python-versions = ">=3.8.0"
files = [
{file = "torch-2.0.1-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:8ced00b3ba471856b993822508f77c98f48a458623596a4c43136158781e306a"},
@@ -6793,8 +6659,7 @@ opt-einsum = ["opt-einsum (>=3.3)"]
name = "torchvision"
version = "0.15.2"
description = "image and video datasets and models for torch deep learning"
-category = "main"
-optional = false
+optional = true
python-versions = ">=3.8"
files = [
{file = "torchvision-0.15.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7754088774e810c5672b142a45dcf20b1bd986a5a7da90f8660c43dc43fb850c"},
@@ -6821,7 +6686,7 @@ files = [
[package.dependencies]
numpy = "*"
-pillow = ">=5.3.0,<8.3.0 || >=8.4.0"
+pillow = ">=5.3.0,<8.3.dev0 || >=8.4.dev0"
requests = "*"
torch = "2.0.1"
@@ -6832,7 +6697,6 @@ scipy = ["scipy"]
name = "tornado"
version = "6.3.2"
description = "Tornado is a Python web framework and asynchronous networking library, originally developed at FriendFeed."
-category = "dev"
optional = false
python-versions = ">= 3.8"
files = [
@@ -6851,21 +6715,20 @@ files = [
[[package]]
name = "tqdm"
-version = "4.65.0"
+version = "4.66.1"
description = "Fast, Extensible Progress Meter"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "tqdm-4.65.0-py3-none-any.whl", hash = "sha256:c4f53a17fe37e132815abceec022631be8ffe1b9381c2e6e30aa70edc99e9671"},
- {file = "tqdm-4.65.0.tar.gz", hash = "sha256:1871fb68a86b8fb3b59ca4cdd3dcccbc7e6d613eeed31f4c332531977b89beb5"},
+ {file = "tqdm-4.66.1-py3-none-any.whl", hash = "sha256:d302b3c5b53d47bce91fea46679d9c3c6508cf6332229aa1e7d8653723793386"},
+ {file = "tqdm-4.66.1.tar.gz", hash = "sha256:d88e651f9db8d8551a62556d3cff9e3034274ca5d66e93197cf2490e2dcb69c7"},
]
[package.dependencies]
colorama = {version = "*", markers = "platform_system == \"Windows\""}
[package.extras]
-dev = ["py-make (>=0.1.0)", "twine", "wheel"]
+dev = ["pytest (>=6)", "pytest-cov", "pytest-timeout", "pytest-xdist"]
notebook = ["ipywidgets (>=6)"]
slack = ["slack-sdk"]
telegram = ["requests"]
@@ -6874,7 +6737,6 @@ telegram = ["requests"]
name = "traitlets"
version = "5.9.0"
description = "Traitlets Python configuration system"
-category = "dev"
optional = false
python-versions = ">=3.7"
files = [
@@ -6890,8 +6752,7 @@ test = ["argcomplete (>=2.0)", "pre-commit", "pytest", "pytest-mock"]
name = "transformers"
version = "4.31.0"
description = "State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow"
-category = "main"
-optional = false
+optional = true
python-versions = ">=3.8.0"
files = [
{file = "transformers-4.31.0-py3-none-any.whl", hash = "sha256:8487aab0195ce1c2a5ae189305118b9720daddbc7b688edb09ccd79e3b149f6b"},
@@ -6960,7 +6821,6 @@ vision = ["Pillow (<10.0.0)"]
name = "twine"
version = "3.8.0"
description = "Collection of utilities for publishing packages on PyPI"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -6984,7 +6844,6 @@ urllib3 = ">=1.26.0"
name = "typer"
version = "0.9.0"
description = "Typer, build great CLIs. Easy to code. Based on Python type hints."
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -7006,7 +6865,6 @@ test = ["black (>=22.3.0,<23.0.0)", "coverage (>=6.2,<7.0)", "isort (>=5.0.6,<6.
name = "types-appdirs"
version = "1.4.3.5"
description = "Typing stubs for appdirs"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -7018,7 +6876,6 @@ files = [
name = "types-cachetools"
version = "5.3.0.6"
description = "Typing stubs for cachetools"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -7030,7 +6887,6 @@ files = [
name = "types-pillow"
version = "9.5.0.6"
description = "Typing stubs for Pillow"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -7042,7 +6898,6 @@ files = [
name = "types-pytz"
version = "2023.3.0.0"
description = "Typing stubs for pytz"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -7054,7 +6909,6 @@ files = [
name = "types-pyyaml"
version = "6.0.12.11"
description = "Typing stubs for PyYAML"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -7066,7 +6920,6 @@ files = [
name = "types-requests"
version = "2.31.0.2"
description = "Typing stubs for requests"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -7081,7 +6934,6 @@ types-urllib3 = "*"
name = "types-urllib3"
version = "1.26.25.14"
description = "Typing stubs for urllib3"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -7091,21 +6943,19 @@ files = [
[[package]]
name = "typing-extensions"
-version = "4.5.0"
+version = "4.7.1"
description = "Backported and Experimental Type Hints for Python 3.7+"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "typing_extensions-4.5.0-py3-none-any.whl", hash = "sha256:fb33085c39dd998ac16d1431ebc293a8b3eedd00fd4a32de0ff79002c19511b4"},
- {file = "typing_extensions-4.5.0.tar.gz", hash = "sha256:5cb5f4a79139d699607b3ef622a1dedafa84e115ab0024e0d9c044a9479ca7cb"},
+ {file = "typing_extensions-4.7.1-py3-none-any.whl", hash = "sha256:440d5dd3af93b060174bf433bccd69b0babc3b15b1a8dca43789fd7f61514b36"},
+ {file = "typing_extensions-4.7.1.tar.gz", hash = "sha256:b75ddc264f0ba5615db7ba217daeb99701ad295353c45f9e95963337ceeeffb2"},
]
[[package]]
name = "typing-inspect"
version = "0.9.0"
description = "Runtime inspection utilities for typing module."
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -7121,7 +6971,6 @@ typing-extensions = ">=3.7.4"
name = "tzdata"
version = "2023.3"
description = "Provider of IANA time zone data"
-category = "main"
optional = false
python-versions = ">=2"
files = [
@@ -7133,7 +6982,6 @@ files = [
name = "uc-micro-py"
version = "1.0.2"
description = "Micro subset of unicode data files for linkify-it-py projects."
-category = "main"
optional = true
python-versions = ">=3.7"
files = [
@@ -7148,7 +6996,6 @@ test = ["coverage", "pytest", "pytest-cov"]
name = "unstructured"
version = "0.7.12"
description = "A library that prepares raw documents for downstream ML tasks."
-category = "main"
optional = false
python-versions = ">=3.7.0"
files = [
@@ -7196,7 +7043,6 @@ wikipedia = ["wikipedia"]
name = "uritemplate"
version = "4.1.1"
description = "Implementation of RFC 6570 URI Templates"
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -7208,7 +7054,6 @@ files = [
name = "urllib3"
version = "1.26.16"
description = "HTTP library with thread-safe connection pooling, file post, and more."
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
files = [
@@ -7225,7 +7070,6 @@ socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
name = "uvicorn"
version = "0.22.0"
description = "The lightning-fast ASGI server."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -7240,7 +7084,7 @@ h11 = ">=0.8"
httptools = {version = ">=0.5.0", optional = true, markers = "extra == \"standard\""}
python-dotenv = {version = ">=0.13", optional = true, markers = "extra == \"standard\""}
pyyaml = {version = ">=5.1", optional = true, markers = "extra == \"standard\""}
-uvloop = {version = ">=0.14.0,<0.15.0 || >0.15.0,<0.15.1 || >0.15.1", optional = true, markers = "sys_platform != \"win32\" and sys_platform != \"cygwin\" and platform_python_implementation != \"PyPy\" and extra == \"standard\""}
+uvloop = {version = ">=0.14.0,<0.15.0 || >0.15.0,<0.15.1 || >0.15.1", optional = true, markers = "(sys_platform != \"win32\" and sys_platform != \"cygwin\") and platform_python_implementation != \"PyPy\" and extra == \"standard\""}
watchfiles = {version = ">=0.13", optional = true, markers = "extra == \"standard\""}
websockets = {version = ">=10.4", optional = true, markers = "extra == \"standard\""}
@@ -7251,7 +7095,6 @@ standard = ["colorama (>=0.4)", "httptools (>=0.5.0)", "python-dotenv (>=0.13)",
name = "uvloop"
version = "0.17.0"
description = "Fast implementation of asyncio event loop on top of libuv"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -7296,7 +7139,6 @@ test = ["Cython (>=0.29.32,<0.30.0)", "aiohttp", "flake8 (>=3.9.2,<3.10.0)", "my
name = "validators"
version = "0.20.0"
description = "Python Data Validation for Humansโข."
-category = "main"
optional = false
python-versions = ">=3.4"
files = [
@@ -7313,7 +7155,6 @@ test = ["flake8 (>=2.4.0)", "isort (>=4.2.2)", "pytest (>=2.2.3)"]
name = "watchfiles"
version = "0.19.0"
description = "Simple, modern and high performance file watching and code reload in python."
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -7348,7 +7189,6 @@ anyio = ">=3.0.0"
name = "wcwidth"
version = "0.2.6"
description = "Measures the displayed width of unicode strings in a terminal"
-category = "dev"
optional = false
python-versions = "*"
files = [
@@ -7360,7 +7200,6 @@ files = [
name = "weaviate-client"
version = "3.22.1"
description = "A python native Weaviate client"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -7381,7 +7220,6 @@ grpc = ["grpcio", "grpcio-tools"]
name = "webencodings"
version = "0.5.1"
description = "Character encoding aliases for legacy web content"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -7393,7 +7231,6 @@ files = [
name = "websocket-client"
version = "1.6.1"
description = "WebSocket client for Python with low level API options"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -7410,7 +7247,6 @@ test = ["websockets"]
name = "websockets"
version = "10.4"
description = "An implementation of the WebSocket Protocol (RFC 6455 & 7692)"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -7487,14 +7323,13 @@ files = [
[[package]]
name = "wheel"
-version = "0.41.0"
+version = "0.41.1"
description = "A built-package format for Python"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "wheel-0.41.0-py3-none-any.whl", hash = "sha256:7e9be3bbd0078f6147d82ed9ed957e323e7708f57e134743d2edef3a7b7972a9"},
- {file = "wheel-0.41.0.tar.gz", hash = "sha256:55a0f0a5a84869bce5ba775abfd9c462e3a6b1b7b7ec69d72c0b83d673a5114d"},
+ {file = "wheel-0.41.1-py3-none-any.whl", hash = "sha256:473219bd4cbedc62cea0cb309089b593e47c15c4a2531015f94e4e3b9a0f6981"},
+ {file = "wheel-0.41.1.tar.gz", hash = "sha256:12b911f083e876e10c595779709f8a88a59f45aacc646492a67fe9ef796c1b47"},
]
[package.extras]
@@ -7504,7 +7339,6 @@ test = ["pytest (>=6.0.0)", "setuptools (>=65)"]
name = "wikipedia"
version = "1.4.0"
description = "Wikipedia API for Python"
-category = "main"
optional = false
python-versions = "*"
files = [
@@ -7519,7 +7353,6 @@ requests = ">=2.0.0,<3.0.0"
name = "win32-setctime"
version = "1.1.0"
description = "A small Python utility to set file creation time on Windows"
-category = "main"
optional = false
python-versions = ">=3.5"
files = [
@@ -7534,7 +7367,6 @@ dev = ["black (>=19.3b0)", "pytest (>=4.6.2)"]
name = "wrapt"
version = "1.15.0"
description = "Module for decorators, wrappers and monkey patching."
-category = "main"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7"
files = [
@@ -7619,7 +7451,6 @@ files = [
name = "xlrd"
version = "2.0.1"
description = "Library for developers to extract data from Microsoft Excel (tm) .xls spreadsheet files"
-category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
files = [
@@ -7636,7 +7467,6 @@ test = ["pytest", "pytest-cov"]
name = "xlsxwriter"
version = "3.1.2"
description = "A Python module for creating Excel XLSX files."
-category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -7648,7 +7478,6 @@ files = [
name = "yarl"
version = "1.9.2"
description = "Yet another URL library"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -7736,7 +7565,6 @@ multidict = ">=4.0"
name = "zipp"
version = "3.16.2"
description = "Backport of pathlib-compatible object wrapper for zip files"
-category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -7752,7 +7580,6 @@ testing = ["big-O", "jaraco.functools", "jaraco.itertools", "more-itertools", "p
name = "zstandard"
version = "0.21.0"
description = "Zstandard bindings for Python"
-category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -7808,9 +7635,11 @@ cffi = {version = ">=1.11", markers = "platform_python_implementation == \"PyPy\
cffi = ["cffi (>=1.11)"]
[extras]
+all = []
deploy = ["langchain-serve"]
+local = ["ctransformers", "llama-cpp-python", "sentence-transformers"]
[metadata]
lock-version = "2.0"
python-versions = ">=3.9,<3.11"
-content-hash = "27e5eacffe71bc931e33a29e35b8ad11502457d291d3b2fc718029845a417627"
+content-hash = "9ac584c1cb5292e437fb4026f507a1ee342f7fd19248ef76f7765ec1bcab3211"
diff --git a/pyproject.toml b/pyproject.toml
index dab73887c..7ba79ca1a 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "langflow"
-version = "0.3.3"
+version = "0.4.7"
description = "A Python package with a built-in web application"
authors = ["Logspace "]
maintainers = [
@@ -19,7 +19,7 @@ readme = "README.md"
keywords = ["nlp", "langchain", "openai", "gpt", "gui"]
packages = [{ include = "langflow", from = "src/backend" }]
include = ["src/backend/langflow/*", "src/backend/langflow/**/*"]
-
+documentation = "https://docs.langflow.org"
[tool.poetry.scripts]
langflow = "langflow.__main__:main"
@@ -33,19 +33,19 @@ google-search-results = "^2.4.1"
google-api-python-client = "^2.79.0"
typer = "^0.9.0"
gunicorn = "^21.1.0"
-langchain = "^0.0.240"
+langchain = "^0.0.256"
openai = "^0.27.8"
pandas = "^2.0.0"
chromadb = "^0.3.21"
-huggingface-hub = "^0.15.0"
+huggingface-hub = { version = "^0.16.0", extras = ["inference"] }
rich = "^13.4.2"
-llama-cpp-python = "~0.1.0"
+llama-cpp-python = { version = "~0.1.0", optional = true }
networkx = "^3.1"
unstructured = "^0.7.0"
pypdf = "^3.11.0"
lxml = "^4.9.2"
pysrt = "^1.1.2"
-fake-useragent = "^1.1.3"
+fake-useragent = "^1.2.1"
docstring-parser = "^0.15"
psycopg2-binary = "^2.9.6"
pyarrow = "^12.0.0"
@@ -56,14 +56,14 @@ qdrant-client = "^1.3.0"
websockets = "^10.3"
weaviate-client = "^3.21.0"
jina = "3.15.2"
-sentence-transformers = "^2.2.2"
-ctransformers = "^0.2.10"
+sentence-transformers = { version = "^2.2.2", optional = true }
+ctransformers = { version = "^0.2.10", optional = true }
cohere = "^4.11.0"
python-multipart = "^0.0.6"
sqlmodel = "^0.0.8"
faiss-cpu = "^1.7.4"
anthropic = "^0.3.0"
-orjson = "^3.9.1"
+orjson = "3.9.3"
multiprocess = "^0.70.14"
cachetools = "^5.3.1"
types-cachetools = "^5.3.0.5"
@@ -75,8 +75,12 @@ certifi = "^2023.5.7"
google-cloud-aiplatform = "^1.26.1"
psycopg = "^3.1.9"
psycopg-binary = "^3.1.9"
+fastavro = "^1.8.0"
+langchain-experimental = "^0.0.8"
+alembic = "^1.11.2"
+metaphor-python = "^0.1.11"
-[tool.poetry.dev-dependencies]
+[tool.poetry.group.dev.dependencies]
black = "^23.1.0"
ipykernel = "^6.21.2"
mypy = "^1.1.1"
@@ -94,6 +98,9 @@ types-pyyaml = "^6.0.12.8"
[tool.poetry.extras]
deploy = ["langchain-serve"]
+local = ["llama-cpp-python", "sentence-transformers", "ctransformers"]
+all = ["deploy", "local"]
+
[tool.pytest.ini_options]
minversion = "6.0"
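With `llama-cpp-python`, `sentence-transformers`, and `ctransformers` now behind the `local` extra, code that touches them has to tolerate their absence. A minimal sketch of the usual guard (hypothetical helper, not Langflow's actual code):

```python
import importlib.util

# Modules provided by the optional "local" extra defined above
LOCAL_EXTRA_MODULES = ("llama_cpp", "sentence_transformers", "ctransformers")


def has_local_extras() -> bool:
    """Return True only if every module from the 'local' extra is importable."""
    return all(
        importlib.util.find_spec(mod) is not None for mod in LOCAL_EXTRA_MODULES
    )
```

Installing the extras would then look like `pip install "langflow[local]"` or `poetry install --extras local`.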
diff --git a/src/backend/Dockerfile b/src/backend/Dockerfile
index 13f2dc62e..8257d89b0 100644
--- a/src/backend/Dockerfile
+++ b/src/backend/Dockerfile
@@ -11,4 +11,4 @@ RUN rm *.whl
EXPOSE 80
-CMD [ "uvicorn", "--host", "0.0.0.0", "--port", "80", "langflow.backend.app:app" ]
+CMD [ "uvicorn", "--host", "0.0.0.0", "--port", "7860", "--factory", "langflow.main:create_app" ]
diff --git a/src/backend/langflow/__init__.py b/src/backend/langflow/__init__.py
index f6d1facdf..f6eb836cc 100644
--- a/src/backend/langflow/__init__.py
+++ b/src/backend/langflow/__init__.py
@@ -1,6 +1,9 @@
from importlib import metadata
-from langflow.cache import cache_manager # noqa: E402
-from langflow.processing.process import load_flow_from_json # noqa: E402
+
+# Deactivate cache manager for now
+# from langflow.services.cache import cache_manager
+from langflow.processing.process import load_flow_from_json
+from langflow.interface.custom.custom_component import CustomComponent
try:
__version__ = metadata.version(__package__)
@@ -9,5 +12,4 @@ except metadata.PackageNotFoundError:
__version__ = ""
del metadata # optional, avoids polluting the results of dir(__package__)
-
-__all__ = ["load_flow_from_json", "cache_manager"]
+__all__ = ["load_flow_from_json", "cache_manager", "CustomComponent"]
diff --git a/src/backend/langflow/__main__.py b/src/backend/langflow/__main__.py
index 385e74932..43247b10f 100644
--- a/src/backend/langflow/__main__.py
+++ b/src/backend/langflow/__main__.py
@@ -1,8 +1,9 @@
-import os
import sys
import time
import httpx
-from multiprocess import Process, cpu_count # type: ignore
+from langflow.services.utils import get_settings_manager
+from langflow.utils.util import get_number_of_workers
+from multiprocess import Process # type: ignore
import platform
from pathlib import Path
from typing import Optional
@@ -12,7 +13,6 @@ from rich import box
from rich import print as rprint
import typer
from langflow.main import setup_app
-from langflow.settings import settings
from langflow.utils.logger import configure, logger
import webbrowser
from dotenv import load_dotenv
@@ -20,52 +20,29 @@ from dotenv import load_dotenv
app = typer.Typer()
-def get_number_of_workers(workers=None):
- if workers == -1:
- workers = (cpu_count() * 2) + 1
- return workers
-
-
def update_settings(
config: str,
cache: str,
dev: bool = False,
- database_url: Optional[str] = None,
remove_api_keys: bool = False,
+ components_path: Optional[Path] = None,
):
"""Update the settings from a config file."""
# Check for database_url in the environment variables
- database_url = database_url or os.getenv("langflow_database_url")
-
+ settings_manager = get_settings_manager()
if config:
- settings.update_from_yaml(config, dev=dev)
- if database_url:
- settings.update_settings(database_url=database_url)
+ logger.debug(f"Loading settings from {config}")
+ settings_manager.settings.update_from_yaml(config, dev=dev)
if remove_api_keys:
- settings.update_settings(remove_api_keys=remove_api_keys)
+ logger.debug(f"Setting remove_api_keys to {remove_api_keys}")
+ settings_manager.settings.update_settings(REMOVE_API_KEYS=remove_api_keys)
if cache:
- settings.update_settings(cache=cache)
-
-
-def load_params():
- """
- Load the parameters from the environment variables.
- """
- global_vars = globals()
-
- for key, value in global_vars.items():
- env_key = f"LANGFLOW_{key.upper()}"
- if env_key in os.environ:
- if isinstance(value, bool):
- # Handle booleans
- global_vars[key] = os.getenv(env_key, str(value)).lower() == "true"
- elif isinstance(value, int):
- # Handle integers
- global_vars[key] = int(os.getenv(env_key, str(value)))
- elif isinstance(value, str) or value is None:
- # Handle strings and None values
- global_vars[key] = os.getenv(env_key, str(value))
+ logger.debug(f"Setting cache to {cache}")
+ settings_manager.settings.update_settings(CACHE=cache)
+ if components_path:
+ logger.debug(f"Adding component path {components_path}")
+ settings_manager.settings.update_settings(COMPONENTS_PATH=components_path)
def serve_on_jcloud():
@@ -120,14 +97,21 @@ def serve(
"127.0.0.1", help="Host to bind the server to.", envvar="LANGFLOW_HOST"
),
workers: int = typer.Option(
- 1, help="Number of worker processes.", envvar="LANGFLOW_WORKERS"
+ 2, help="Number of worker processes.", envvar="LANGFLOW_WORKERS"
),
- timeout: int = typer.Option(60, help="Worker timeout in seconds."),
+ timeout: int = typer.Option(300, help="Worker timeout in seconds."),
port: int = typer.Option(7860, help="Port to listen on.", envvar="LANGFLOW_PORT"),
- config: str = typer.Option("config.yaml", help="Path to the configuration file."),
+ components_path: Optional[Path] = typer.Option(
+ Path(__file__).parent / "components",
+ help="Path to the directory containing custom components.",
+ envvar="LANGFLOW_COMPONENTS_PATH",
+ ),
+ config: str = typer.Option(
+ Path(__file__).parent / "config.yaml", help="Path to the configuration file."
+ ),
# .env file param
env_file: Path = typer.Option(
- ".env", help="Path to the .env file containing environment variables."
+ None, help="Path to the .env file containing environment variables."
),
log_level: str = typer.Option(
"critical", help="Logging level.", envvar="LANGFLOW_LOG_LEVEL"
@@ -142,11 +126,13 @@ def serve(
),
jcloud: bool = typer.Option(False, help="Deploy on Jina AI Cloud"),
dev: bool = typer.Option(False, help="Run in development mode (may contain bugs)"),
- database_url: str = typer.Option(
- None,
- help="Database URL to connect to. If not provided, a local SQLite database will be used.",
- envvar="LANGFLOW_DATABASE_URL",
- ),
+ # This variable does not work but is set by the .env file
+ # and works with Pydantic
+ # database_url: str = typer.Option(
+ # None,
+ # help="Database URL to connect to. If not provided, a local SQLite database will be used.",
+ # envvar="LANGFLOW_DATABASE_URL",
+ # ),
path: str = typer.Option(
None,
help="Path to the frontend directory containing build files. This is for development purposes only.",
@@ -162,6 +148,11 @@ def serve(
help="Remove API keys from the projects saved in the database.",
envvar="LANGFLOW_REMOVE_API_KEYS",
),
+ backend_only: bool = typer.Option(
+ False,
+ help="Run only the backend server without the frontend.",
+ envvar="LANGFLOW_BACKEND_ONLY",
+ ),
):
"""
Run the Langflow server.
@@ -169,7 +160,6 @@ def serve(
# override env variables with .env file
if env_file:
load_dotenv(env_file, override=True)
- load_params()
if jcloud:
return serve_on_jcloud()
@@ -178,13 +168,13 @@ def serve(
update_settings(
config,
dev=dev,
- database_url=database_url,
remove_api_keys=remove_api_keys,
cache=cache,
+ components_path=components_path,
)
# create path object if path is provided
static_files_dir: Optional[Path] = Path(path) if path else None
- app = setup_app(static_files_dir=static_files_dir)
+ app = setup_app(static_files_dir=static_files_dir, backend_only=backend_only)
# check if port is being used
if is_port_in_use(port, host):
port = get_free_port(port)
@@ -196,6 +186,10 @@ def serve(
"timeout": timeout,
}
+ # Define an env variable to know if we are just testing the server
+ if "pytest" in sys.modules:
+ return
+
if platform.system() in ["Windows"]:
# Run using uvicorn on MacOS and Windows
# Windows doesn't support gunicorn
@@ -298,7 +292,7 @@ def run_langflow(host, port, log_level, options, app):
Run Langflow server on localhost
"""
try:
- if platform.system() in ["Darwin", "Windows"]:
+ if platform.system() in ["Windows"]:
# Run using uvicorn on MacOS and Windows
# Windows doesn't support gunicorn
# MacOS requires an env variable to be set to use gunicorn
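The `get_number_of_workers` helper deleted above now lives in `langflow.utils.util`; based on the removed lines, it implements the common gunicorn heuristic. A sketch (using the stdlib `multiprocessing` rather than the `multiprocess` package; `cpu_count` behaves the same here):

```python
from multiprocessing import cpu_count
from typing import Optional


def get_number_of_workers(workers: Optional[int] = None) -> Optional[int]:
    """-1 means 'auto': (2 * CPU cores) + 1, the usual gunicorn rule of thumb."""
    if workers == -1:
        return (cpu_count() * 2) + 1
    return workers
```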
diff --git a/src/backend/langflow/alembic.ini b/src/backend/langflow/alembic.ini
new file mode 100644
index 000000000..379661422
--- /dev/null
+++ b/src/backend/langflow/alembic.ini
@@ -0,0 +1,113 @@
+# A generic, single database configuration.
+
+[alembic]
+# path to migration scripts
+script_location = alembic
+
+# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
+# Uncomment the line below if you want the files to be prepended with date and time
+# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
+# for all available tokens
+# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s
+
+# sys.path path, will be prepended to sys.path if present.
+# defaults to the current working directory.
+prepend_sys_path = .
+
+# timezone to use when rendering the date within the migration file
+# as well as the filename.
+# If specified, requires the python-dateutil library that can be
+# installed by adding `alembic[tz]` to the pip requirements
+# string value is passed to dateutil.tz.gettz()
+# leave blank for localtime
+# timezone =
+
+# max length of characters to apply to the
+# "slug" field
+# truncate_slug_length = 40
+
+# set to 'true' to run the environment during
+# the 'revision' command, regardless of autogenerate
+# revision_environment = false
+
+# set to 'true' to allow .pyc and .pyo files without
+# a source .py file to be detected as revisions in the
+# versions/ directory
+# sourceless = false
+
+# version location specification; This defaults
+# to alembic/versions. When using multiple version
+# directories, initial revisions must be specified with --version-path.
+# The path separator used here should be the separator specified by "version_path_separator" below.
+# version_locations = %(here)s/bar:%(here)s/bat:alembic/versions
+
+# version path separator; As mentioned above, this is the character used to split
+# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
+# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
+# Valid values for version_path_separator are:
+#
+# version_path_separator = :
+# version_path_separator = ;
+# version_path_separator = space
+version_path_separator = os # Use os.pathsep. Default configuration used for new projects.
+
+# set to 'true' to search source files recursively
+# in each "version_locations" directory
+# new in Alembic version 1.10
+# recursive_version_locations = false
+
+# the output encoding used when revision files
+# are written from script.py.mako
+# output_encoding = utf-8
+
+# This is the path to the db in the root of the project.
+# When the user runs Langflow, the database URL will
+# be set dynamically.
+sqlalchemy.url = sqlite:///../../../langflow.db
+
+
+[post_write_hooks]
+# post_write_hooks defines scripts or Python functions that are run
+# on newly generated revision scripts. See the documentation for further
+# detail and examples
+
+# format using "black" - use the console_scripts runner, against the "black" entrypoint
+# hooks = black
+# black.type = console_scripts
+# black.entrypoint = black
+# black.options = -l 79 REVISION_SCRIPT_FILENAME
+
+# Logging configuration
+[loggers]
+keys = root,sqlalchemy,alembic
+
+[handlers]
+keys = console
+
+[formatters]
+keys = generic
+
+[logger_root]
+level = WARN
+handlers = console
+qualname =
+
+[logger_sqlalchemy]
+level = WARN
+handlers =
+qualname = sqlalchemy.engine
+
+[logger_alembic]
+level = INFO
+handlers =
+qualname = alembic
+
+[handler_console]
+class = StreamHandler
+args = (sys.stderr,)
+level = NOTSET
+formatter = generic
+
+[formatter_generic]
+format = %(levelname)-5.5s [%(name)s] %(message)s
+datefmt = %H:%M:%S
diff --git a/src/backend/langflow/alembic/README b/src/backend/langflow/alembic/README
new file mode 100644
index 000000000..98e4f9c44
--- /dev/null
+++ b/src/backend/langflow/alembic/README
@@ -0,0 +1 @@
+Generic single-database configuration.
\ No newline at end of file
diff --git a/src/backend/langflow/alembic/env.py b/src/backend/langflow/alembic/env.py
new file mode 100644
index 000000000..310894431
--- /dev/null
+++ b/src/backend/langflow/alembic/env.py
@@ -0,0 +1,78 @@
+from logging.config import fileConfig
+
+from sqlalchemy import engine_from_config
+from sqlalchemy import pool
+
+from alembic import context
+
+from langflow.services.database.manager import SQLModel
+
+# this is the Alembic Config object, which provides
+# access to the values within the .ini file in use.
+config = context.config
+
+# Interpret the config file for Python logging.
+# This line sets up loggers basically.
+if config.config_file_name is not None:
+ fileConfig(config.config_file_name)
+
+# add your model's MetaData object here
+# for 'autogenerate' support
+# from myapp import mymodel
+# target_metadata = mymodel.Base.metadata
+target_metadata = SQLModel.metadata
+
+# other values from the config, defined by the needs of env.py,
+# can be acquired:
+# my_important_option = config.get_main_option("my_important_option")
+# ... etc.
+
+
+def run_migrations_offline() -> None:
+ """Run migrations in 'offline' mode.
+
+ This configures the context with just a URL
+ and not an Engine, though an Engine is acceptable
+ here as well. By skipping the Engine creation
+ we don't even need a DBAPI to be available.
+
+ Calls to context.execute() here emit the given string to the
+ script output.
+
+ """
+ url = config.get_main_option("sqlalchemy.url")
+ context.configure(
+ url=url,
+ target_metadata=target_metadata,
+ literal_binds=True,
+ dialect_opts={"paramstyle": "named"},
+ )
+
+ with context.begin_transaction():
+ context.run_migrations()
+
+
+def run_migrations_online() -> None:
+ """Run migrations in 'online' mode.
+
+ In this scenario we need to create an Engine
+ and associate a connection with the context.
+
+ """
+ connectable = engine_from_config(
+ config.get_section(config.config_ini_section, {}),
+ prefix="sqlalchemy.",
+ poolclass=pool.NullPool,
+ )
+
+ with connectable.connect() as connection:
+ context.configure(connection=connection, target_metadata=target_metadata)
+
+ with context.begin_transaction():
+ context.run_migrations()
+
+
+if context.is_offline_mode():
+ run_migrations_offline()
+else:
+ run_migrations_online()
diff --git a/src/backend/langflow/alembic/script.py.mako b/src/backend/langflow/alembic/script.py.mako
new file mode 100644
index 000000000..6ce335109
--- /dev/null
+++ b/src/backend/langflow/alembic/script.py.mako
@@ -0,0 +1,27 @@
+"""${message}
+
+Revision ID: ${up_revision}
+Revises: ${down_revision | comma,n}
+Create Date: ${create_date}
+
+"""
+from typing import Sequence, Union
+
+from alembic import op
+import sqlalchemy as sa
+import sqlmodel
+${imports if imports else ""}
+
+# revision identifiers, used by Alembic.
+revision: str = ${repr(up_revision)}
+down_revision: Union[str, None] = ${repr(down_revision)}
+branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
+depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
+
+
+def upgrade() -> None:
+ ${upgrades if upgrades else "pass"}
+
+
+def downgrade() -> None:
+ ${downgrades if downgrades else "pass"}
diff --git a/src/backend/langflow/alembic/versions/0a534bdfd84b_remove_flowstyles_table.py b/src/backend/langflow/alembic/versions/0a534bdfd84b_remove_flowstyles_table.py
new file mode 100644
index 000000000..0100df44d
--- /dev/null
+++ b/src/backend/langflow/alembic/versions/0a534bdfd84b_remove_flowstyles_table.py
@@ -0,0 +1,42 @@
+"""Remove FlowStyles table
+
+Revision ID: 0a534bdfd84b
+Revises: 4814b6f4abfd
+Create Date: 2023-08-07 14:09:06.844104
+
+"""
+from typing import Sequence, Union
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision: str = "0a534bdfd84b"
+down_revision: Union[str, None] = "4814b6f4abfd"
+branch_labels: Union[str, Sequence[str], None] = None
+depends_on: Union[str, Sequence[str], None] = None
+
+
+def upgrade() -> None:
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("flowstyle")
+ # ### end Alembic commands ###
+
+
+def downgrade() -> None:
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "flowstyle",
+ sa.Column("color", sa.VARCHAR(), nullable=False),
+ sa.Column("emoji", sa.VARCHAR(), nullable=False),
+ sa.Column("flow_id", sa.CHAR(length=32), nullable=True),
+ sa.Column("id", sa.CHAR(length=32), nullable=False),
+ sa.ForeignKeyConstraint(
+ ["flow_id"],
+ ["flow.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("id"),
+ )
+ # ### end Alembic commands ###
diff --git a/src/backend/langflow/alembic/versions/4814b6f4abfd_add_flow_table.py b/src/backend/langflow/alembic/versions/4814b6f4abfd_add_flow_table.py
new file mode 100644
index 000000000..0b2f32657
--- /dev/null
+++ b/src/backend/langflow/alembic/versions/4814b6f4abfd_add_flow_table.py
@@ -0,0 +1,65 @@
+"""Add Flow table
+
+Revision ID: 4814b6f4abfd
+Revises:
+Create Date: 2023-08-05 17:47:42.879824
+
+"""
+
+import contextlib
+from typing import Sequence, Union
+
+from alembic import op
+import sqlalchemy as sa
+import sqlmodel
+
+
+# revision identifiers, used by Alembic.
+revision: str = "4814b6f4abfd"
+down_revision: Union[str, None] = None
+branch_labels: Union[str, Sequence[str], None] = None
+depends_on: Union[str, Sequence[str], None] = None
+
+
+def upgrade() -> None:
+ # ### commands auto generated by Alembic - please adjust! ###
+
+    # Suppress the error so the migration doesn't break if the table already exists.
+ with contextlib.suppress(sa.exc.OperationalError):
+ op.create_table(
+ "flow",
+ sa.Column("data", sa.JSON(), nullable=True),
+ sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
+ sa.Column("description", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
+ sa.Column("id", sqlmodel.sql.sqltypes.GUID(), nullable=False),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("id"),
+ )
+ op.create_index(
+ op.f("ix_flow_description"), "flow", ["description"], unique=False
+ )
+ op.create_index(op.f("ix_flow_name"), "flow", ["name"], unique=False)
+ with contextlib.suppress(sa.exc.OperationalError):
+ op.create_table(
+ "flowstyle",
+ sa.Column("color", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
+ sa.Column("emoji", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
+ sa.Column("flow_id", sqlmodel.sql.sqltypes.GUID(), nullable=True),
+ sa.Column("id", sqlmodel.sql.sqltypes.GUID(), nullable=False),
+ sa.ForeignKeyConstraint(
+ ["flow_id"],
+ ["flow.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("id"),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade() -> None:
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("flowstyle")
+ op.drop_index(op.f("ix_flow_name"), table_name="flow")
+ op.drop_index(op.f("ix_flow_description"), table_name="flow")
+ op.drop_table("flow")
+ # ### end Alembic commands ###
diff --git a/src/backend/langflow/api/router.py b/src/backend/langflow/api/router.py
index f090abe74..ea1938a75 100644
--- a/src/backend/langflow/api/router.py
+++ b/src/backend/langflow/api/router.py
@@ -5,7 +5,7 @@ from langflow.api.v1 import (
endpoints_router,
validate_router,
flows_router,
- flow_styles_router,
+ component_router,
)
router = APIRouter(
@@ -14,5 +14,5 @@ router = APIRouter(
router.include_router(chat_router)
router.include_router(endpoints_router)
router.include_router(validate_router)
+router.include_router(component_router)
router.include_router(flows_router)
-router.include_router(flow_styles_router)
diff --git a/src/backend/langflow/api/utils.py b/src/backend/langflow/api/utils.py
index 2384a4089..0fb53e541 100644
--- a/src/backend/langflow/api/utils.py
+++ b/src/backend/langflow/api/utils.py
@@ -57,3 +57,39 @@ def build_input_keys_response(langchain_object, artifacts):
input_keys_response["template"] = langchain_object.prompt.template
return input_keys_response
+
+
+def merge_nested_dicts(dict1, dict2):
+ for key, value in dict2.items():
+ if isinstance(value, dict) and isinstance(dict1.get(key), dict):
+ dict1[key] = merge_nested_dicts(dict1[key], value)
+ else:
+ dict1[key] = value
+ return dict1
+
+
+def merge_nested_dicts_with_renaming(dict1, dict2):
+ for key, value in dict2.items():
+ if (
+ key in dict1
+ and isinstance(value, dict)
+ and isinstance(dict1.get(key), dict)
+ ):
+ for sub_key, sub_value in value.items():
+ if sub_key in dict1[key]:
+ new_key = get_new_key(dict1[key], sub_key)
+ dict1[key][new_key] = sub_value
+ else:
+ dict1[key][sub_key] = sub_value
+ else:
+ dict1[key] = value
+ return dict1
+
+
+def get_new_key(dictionary, original_key):
+ counter = 1
+ new_key = original_key + " (" + str(counter) + ")"
+ while new_key in dictionary:
+ counter += 1
+ new_key = original_key + " (" + str(counter) + ")"
+ return new_key
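The renaming merge added above can be exercised in isolation. Lightly condensed from the two helpers in the diff, a colliding sub-key gets a numbered alias instead of being overwritten:

```python
def get_new_key(dictionary, original_key):
    """Find the first 'key (n)' alias not already present in the dict."""
    counter = 1
    new_key = f"{original_key} ({counter})"
    while new_key in dictionary:
        counter += 1
        new_key = f"{original_key} ({counter})"
    return new_key


def merge_nested_dicts_with_renaming(dict1, dict2):
    for key, value in dict2.items():
        if key in dict1 and isinstance(value, dict) and isinstance(dict1.get(key), dict):
            for sub_key, sub_value in value.items():
                if sub_key in dict1[key]:
                    # Collision: keep both values under a numbered alias
                    dict1[key][get_new_key(dict1[key], sub_key)] = sub_value
                else:
                    dict1[key][sub_key] = sub_value
        else:
            dict1[key] = value
    return dict1


merged = merge_nested_dicts_with_renaming(
    {"tools": {"Search": "a"}},
    {"tools": {"Search": "b", "Calculator": "c"}},
)
# merged["tools"] now holds "Search", "Search (1)", and "Calculator"
```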
diff --git a/src/backend/langflow/api/v1/__init__.py b/src/backend/langflow/api/v1/__init__.py
index f18f90e42..b6e7b36d8 100644
--- a/src/backend/langflow/api/v1/__init__.py
+++ b/src/backend/langflow/api/v1/__init__.py
@@ -2,12 +2,12 @@ from langflow.api.v1.endpoints import router as endpoints_router
from langflow.api.v1.validate import router as validate_router
from langflow.api.v1.chat import router as chat_router
from langflow.api.v1.flows import router as flows_router
-from langflow.api.v1.flow_styles import router as flow_styles_router
+from langflow.api.v1.components import router as component_router
__all__ = [
"chat_router",
"endpoints_router",
+ "component_router",
"validate_router",
"flows_router",
- "flow_styles_router",
]
diff --git a/src/backend/langflow/api/v1/callback.py b/src/backend/langflow/api/v1/callback.py
index deddde47f..69dbf5082 100644
--- a/src/backend/langflow/api/v1/callback.py
+++ b/src/backend/langflow/api/v1/callback.py
@@ -91,8 +91,8 @@ class AsyncStreamingLLMCallbackHandler(AsyncCallbackHandler):
# This is to emulate the stream of tokens
for resp in resps:
await self.websocket.send_json(resp.dict())
- except Exception as e:
- logger.error(e)
+ except Exception as exc:
+ logger.error(f"Error sending response: {exc}")
async def on_tool_error(
self, error: Union[Exception, KeyboardInterrupt], **kwargs: Any
diff --git a/src/backend/langflow/api/v1/chat.py b/src/backend/langflow/api/v1/chat.py
index 43f10a54b..611407e8d 100644
--- a/src/backend/langflow/api/v1/chat.py
+++ b/src/backend/langflow/api/v1/chat.py
@@ -3,13 +3,13 @@ from fastapi.responses import StreamingResponse
from langflow.api.utils import build_input_keys_response
from langflow.api.v1.schemas import BuildStatus, BuiltResponse, InitResponse, StreamData
-from langflow.chat.manager import ChatManager
+from langflow.services import service_manager, ServiceType
from langflow.graph.graph.base import Graph
from langflow.utils.logger import logger
from cachetools import LRUCache
router = APIRouter(tags=["Chat"])
-chat_manager = ChatManager()
+
flow_data_store: LRUCache = LRUCache(maxsize=10)
@@ -17,6 +17,7 @@ flow_data_store: LRUCache = LRUCache(maxsize=10)
async def chat(client_id: str, websocket: WebSocket):
"""Websocket endpoint for chat."""
try:
+ chat_manager = service_manager.get(ServiceType.CHAT_MANAGER)
if client_id in chat_manager.in_memory_cache:
await chat_manager.handle_websocket(client_id, websocket)
else:
@@ -26,7 +27,7 @@ async def chat(client_id: str, websocket: WebSocket):
message = "Please, build the flow before sending messages"
await websocket.close(code=status.WS_1011_INTERNAL_ERROR, reason=message)
except WebSocketException as exc:
- logger.error(exc)
+ logger.error(f"Websocket error: {exc}")
await websocket.close(code=status.WS_1011_INTERNAL_ERROR, reason=str(exc))
@@ -45,6 +46,7 @@ async def init_build(graph_data: dict, flow_id: str):
return InitResponse(flowId=flow_id)
# Delete from cache if already exists
+ chat_manager = service_manager.get(ServiceType.CHAT_MANAGER)
if flow_id in chat_manager.in_memory_cache:
with chat_manager.in_memory_cache._lock:
chat_manager.in_memory_cache.delete(flow_id)
@@ -56,7 +58,7 @@ async def init_build(graph_data: dict, flow_id: str):
return InitResponse(flowId=flow_id)
except Exception as exc:
- logger.error(exc)
+ logger.error(f"Error initializing build: {exc}")
return HTTPException(status_code=500, detail=str(exc))
@@ -74,7 +76,7 @@ async def build_status(flow_id: str):
)
except Exception as exc:
- logger.error(exc)
+ logger.error(f"Error checking build status: {exc}")
return HTTPException(status_code=500, detail=str(exc))
@@ -125,9 +127,8 @@ async def stream_build(flow_id: str):
vertex.build()
params = vertex._built_object_repr()
valid = True
- logger.debug(
- f"Building node {str(params)[:50]}{'...' if len(str(params)) > 50 else ''}"
- )
+ logger.debug(f"Building node {str(vertex.vertex_type)}")
+ logger.debug(f"Output: {params}")
if vertex.artifacts:
# The artifacts will be prompt variables
# passed to build_input_keys_response
@@ -156,12 +157,12 @@ async def stream_build(flow_id: str):
)
else:
input_keys_response = {
- "input_keys": {},
+ "input_keys": None,
"memory_keys": [],
"handle_keys": [],
}
yield str(StreamData(event="message", data=input_keys_response))
-
+ chat_manager = service_manager.get(ServiceType.CHAT_MANAGER)
chat_manager.set_cache(flow_id, langchain_object)
# We need to reset the chat history
chat_manager.chat_history.empty_history(flow_id)
@@ -177,5 +178,5 @@ async def stream_build(flow_id: str):
try:
return StreamingResponse(event_stream(flow_id), media_type="text/event-stream")
except Exception as exc:
- logger.error(exc)
+ logger.error(f"Error streaming build: {exc}")
raise HTTPException(status_code=500, detail=str(exc))
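`chat.py` now fetches the chat manager through `service_manager.get(ServiceType.CHAT_MANAGER)` instead of a module-level singleton. A minimal sketch of that service-locator shape (hypothetical implementation; the real registry lives in `langflow.services`):

```python
from enum import Enum


class ServiceType(Enum):
    CHAT_MANAGER = "chat_manager"


class ServiceManager:
    """Registry that hands out shared service instances by type."""

    def __init__(self) -> None:
        self._services = {}

    def register(self, service_type: ServiceType, instance: object) -> None:
        self._services[service_type] = instance

    def get(self, service_type: ServiceType) -> object:
        return self._services[service_type]


service_manager = ServiceManager()
service_manager.register(ServiceType.CHAT_MANAGER, object())
```

Fetching the service lazily inside each handler, rather than constructing it at import time, keeps module import cheap and lets tests swap the registered instance.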
diff --git a/src/backend/langflow/api/v1/components.py b/src/backend/langflow/api/v1/components.py
new file mode 100644
index 000000000..4071461fb
--- /dev/null
+++ b/src/backend/langflow/api/v1/components.py
@@ -0,0 +1,77 @@
+from datetime import timezone
+from typing import List
+from uuid import UUID
+from langflow.services.database.models.component import Component, ComponentModel
+from langflow.services.utils import get_session
+from sqlmodel import Session, select
+from fastapi import APIRouter, Depends, HTTPException
+from sqlalchemy.exc import IntegrityError
+from datetime import datetime
+
+
+COMPONENT_NOT_FOUND = "Component not found"
+COMPONENT_ALREADY_EXISTS = "A component with the same id already exists."
+COMPONENT_DELETED = "Component deleted"
+
+
+router = APIRouter(prefix="/components", tags=["Components"])
+
+
+@router.post("/", response_model=Component)
+def create_component(component: ComponentModel, db: Session = Depends(get_session)):
+ db_component = Component(**component.dict())
+ try:
+ db.add(db_component)
+ db.commit()
+ db.refresh(db_component)
+ except IntegrityError as e:
+ db.rollback()
+ raise HTTPException(
+ status_code=400,
+ detail=COMPONENT_ALREADY_EXISTS,
+ ) from e
+ return db_component
+
+
+@router.get("/{component_id}", response_model=Component)
+def read_component(component_id: UUID, db: Session = Depends(get_session)):
+ if component := db.get(Component, component_id):
+ return component
+ else:
+ raise HTTPException(status_code=404, detail=COMPONENT_NOT_FOUND)
+
+
+@router.get("/", response_model=List[Component])
+def read_components(skip: int = 0, limit: int = 50, db: Session = Depends(get_session)):
+ query = select(Component)
+ query = query.offset(skip).limit(limit)
+
+ return db.execute(query).fetchall()
+
+
+@router.patch("/{component_id}", response_model=Component)
+def update_component(
+ component_id: UUID, component: ComponentModel, db: Session = Depends(get_session)
+):
+ db_component = db.get(Component, component_id)
+ if not db_component:
+ raise HTTPException(status_code=404, detail=COMPONENT_NOT_FOUND)
+ component_data = component.dict(exclude_unset=True)
+
+ for key, value in component_data.items():
+ setattr(db_component, key, value)
+
+ db_component.update_at = datetime.now(timezone.utc)
+ db.commit()
+ db.refresh(db_component)
+ return db_component
+
+
+@router.delete("/{component_id}")
+def delete_component(component_id: UUID, db: Session = Depends(get_session)):
+ component = db.get(Component, component_id)
+ if not component:
+ raise HTTPException(status_code=404, detail=COMPONENT_NOT_FOUND)
+ db.delete(component)
+ db.commit()
+ return {"detail": COMPONENT_DELETED}
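The PATCH handler above applies only the fields the client actually sent (`component.dict(exclude_unset=True)`) and then refreshes the timestamp. A minimal stdlib sketch of that partial-update pattern, where the `ComponentRow` dataclass is a hypothetical stand-in for the SQLModel table:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ComponentRow:
    # hypothetical stand-in for the Component SQLModel table
    name: str
    description: str
    updated_at: Optional[datetime] = None


def apply_patch(row: ComponentRow, patch: dict) -> ComponentRow:
    """Set only the keys present in the patch, then bump the timestamp."""
    for key, value in patch.items():
        setattr(row, key, value)
    row.updated_at = datetime.now(timezone.utc)
    return row
```

Fields absent from the patch dict are never touched, so a client sending `{"description": "new"}` cannot accidentally null out `name`.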
diff --git a/src/backend/langflow/api/v1/endpoints.py b/src/backend/langflow/api/v1/endpoints.py
index 13cba6c2c..5d0c9a900 100644
--- a/src/backend/langflow/api/v1/endpoints.py
+++ b/src/backend/langflow/api/v1/endpoints.py
@@ -1,18 +1,31 @@
-from typing import Optional
-from langflow.cache.utils import save_uploaded_file
-from langflow.database.models.flow import Flow
-from langflow.processing.process import process_graph_cached, process_tweaks
-from langflow.utils.logger import logger
+from http import HTTPStatus
+from typing import Annotated, Optional
+
+from langflow.services.cache.utils import save_uploaded_file
+from langflow.services.database.models.flow import Flow
+from langflow.processing.process import process_graph_cached, process_tweaks
+from langflow.services.utils import get_settings_manager
+from langflow.utils.logger import logger
+from fastapi import APIRouter, Depends, HTTPException, UploadFile, Body
+
+from langflow.interface.custom.custom_component import CustomComponent
-from fastapi import APIRouter, Depends, HTTPException, UploadFile
from langflow.api.v1.schemas import (
ProcessResponse,
UploadFileResponse,
+ CustomComponentCode,
)
-from langflow.interface.types import langchain_types_dict
-from langflow.database.base import get_session
+from langflow.api.utils import merge_nested_dicts_with_renaming
+
+from langflow.interface.types import (
+ build_langchain_types_dict,
+ build_langchain_template_custom_component,
+ build_langchain_custom_component_list_from_path,
+)
+
+from langflow.services.utils import get_session
from sqlmodel import Session
# build router
@@ -21,7 +34,37 @@ router = APIRouter(tags=["Base"])
@router.get("/all")
def get_all():
- return langchain_types_dict
+ logger.debug("Building langchain types dict")
+ native_components = build_langchain_types_dict()
+ # custom_components is a list of dicts
+ # need to merge all the keys into one dict
+ custom_components_from_file = {}
+ settings_manager = get_settings_manager()
+ if settings_manager.settings.COMPONENTS_PATH:
+ logger.info(
+ f"Building custom components from {settings_manager.settings.COMPONENTS_PATH}"
+ )
+ custom_component_dicts = [
+ build_langchain_custom_component_list_from_path(str(path))
+ for path in settings_manager.settings.COMPONENTS_PATH
+ ]
+ logger.info(f"Loading {len(custom_component_dicts)} category(ies)")
+ for custom_component_dict in custom_component_dicts:
+ # custom_component_dict is a dict of dicts
+ if not custom_component_dict:
+ continue
+ category = list(custom_component_dict.keys())[0]
+ logger.info(
+ f"Loading {len(custom_component_dict[category])} component(s) from category {category}"
+ )
+ logger.debug(custom_component_dict)
+ custom_components_from_file = merge_nested_dicts_with_renaming(
+ custom_components_from_file, custom_component_dict
+ )
+
+ return merge_nested_dicts_with_renaming(
+ native_components, custom_components_from_file
+ )
# For backwards compatibility we will keep the old endpoint
@@ -31,6 +74,7 @@ async def process_flow(
flow_id: str,
inputs: Optional[dict] = None,
tweaks: Optional[dict] = None,
+ clear_cache: Annotated[bool, Body(embed=True)] = False, # noqa: F821
session: Session = Depends(get_session),
):
"""
@@ -50,7 +94,7 @@ async def process_flow(
graph_data = process_tweaks(graph_data, tweaks)
except Exception as exc:
logger.error(f"Error processing tweaks: {exc}")
- response = process_graph_cached(graph_data, inputs)
+ response = process_graph_cached(graph_data, inputs, clear_cache)
return ProcessResponse(
result=response,
)
@@ -60,7 +104,11 @@ async def process_flow(
raise HTTPException(status_code=500, detail=str(e)) from e
-@router.post("/upload/{flow_id}", response_model=UploadFileResponse, status_code=201)
+@router.post(
+ "/upload/{flow_id}",
+ response_model=UploadFileResponse,
+ status_code=HTTPStatus.CREATED,
+)
async def create_upload_file(file: UploadFile, flow_id: str):
# Cache file
try:
@@ -81,3 +129,13 @@ def get_version():
from langflow import __version__
return {"version": __version__}
+
+
+@router.post("/custom_component", status_code=HTTPStatus.OK)
+async def custom_component(
+ raw_code: CustomComponentCode,
+):
+ extractor = CustomComponent(code=raw_code.code)
+ extractor.is_check_valid()
+
+ return build_langchain_template_custom_component(extractor)
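The `/all` endpoint above folds file-based custom components into the native dict with `merge_nested_dicts_with_renaming`. A self-contained sketch of that merge, assuming rename-on-collision semantics; the ` (1)` suffix scheme is an illustration, not the actual implementation:

```python
import copy


def merge_nested_dicts_with_renaming(base: dict, new: dict) -> dict:
    """Merge {category: {name: node}} dicts, keeping both entries on a name clash."""
    result = copy.deepcopy(base)
    for category, components in new.items():
        target = result.setdefault(category, {})
        for name, value in components.items():
            key, i = name, 1
            while key in target:  # rename instead of overwriting
                key = f"{name} ({i})"
                i += 1
            target[key] = value
    return result
```

The deep copy keeps the native components dict untouched, so repeated calls to the endpoint cannot accumulate renamed duplicates.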
diff --git a/src/backend/langflow/api/v1/flow_styles.py b/src/backend/langflow/api/v1/flow_styles.py
deleted file mode 100644
index 40e292eb3..000000000
--- a/src/backend/langflow/api/v1/flow_styles.py
+++ /dev/null
@@ -1,83 +0,0 @@
-from uuid import UUID
-from langflow.database.models.flow_style import (
- FlowStyle,
- FlowStyleCreate,
- FlowStyleRead,
- FlowStyleUpdate,
-)
-from langflow.database.base import get_session
-from sqlmodel import Session, select
-from fastapi import APIRouter, Depends, HTTPException
-
-
-# build router
-router = APIRouter(prefix="/flow_styles", tags=["FlowStyles"])
-
-# FlowStyleCreate:
-# class FlowStyleBase(SQLModel):
-# color: str = Field(index=True)
-# emoji: str = Field(index=False)
-# flow_id: UUID = Field(default=None, foreign_key="flow.id")
-
-
-@router.post("/", response_model=FlowStyleRead)
-def create_flow_style(
- *, session: Session = Depends(get_session), flow_style: FlowStyleCreate
-):
- """Create a new flow_style."""
- db_flow_style = FlowStyle.from_orm(flow_style)
- session.add(db_flow_style)
- session.commit()
- session.refresh(db_flow_style)
- return db_flow_style
-
-
-@router.get("/", response_model=list[FlowStyleRead])
-def read_flow_styles(*, session: Session = Depends(get_session)):
- """Read all flows."""
- try:
- flows = session.exec(select(FlowStyle)).all()
- except Exception as e:
- raise HTTPException(status_code=500, detail=str(e)) from e
- return flows
-
-
-@router.get("/{flow_styles_id}", response_model=FlowStyleRead)
-def read_flow_style(*, session: Session = Depends(get_session), flow_styles_id: UUID):
- """Read a flow_style."""
- if flow_style := session.get(FlowStyle, flow_styles_id):
- return flow_style
- else:
- raise HTTPException(status_code=404, detail="FlowStyle not found")
-
-
-@router.patch("/{flow_style_id}", response_model=FlowStyleRead)
-def update_flow_style(
- *,
- session: Session = Depends(get_session),
- flow_style_id: UUID,
- flow_style: FlowStyleUpdate,
-):
- """Update a flow_style."""
- db_flow_style = session.get(FlowStyle, flow_style_id)
- if not db_flow_style:
- raise HTTPException(status_code=404, detail="FlowStyle not found")
- flow_data = flow_style.dict(exclude_unset=True)
- for key, value in flow_data.items():
- if hasattr(db_flow_style, key) and value is not None:
- setattr(db_flow_style, key, value)
- session.add(db_flow_style)
- session.commit()
- session.refresh(db_flow_style)
- return db_flow_style
-
-
-@router.delete("/{flow_id}")
-def delete_flow_style(*, session: Session = Depends(get_session), flow_id: UUID):
- """Delete a flow_style."""
- flow_style = session.get(FlowStyle, flow_id)
- if not flow_style:
- raise HTTPException(status_code=404, detail="FlowStyle not found")
- session.delete(flow_style)
- session.commit()
- return {"message": "FlowStyle deleted successfully"}
diff --git a/src/backend/langflow/api/v1/flows.py b/src/backend/langflow/api/v1/flows.py
index 4e000a128..3145ced3c 100644
--- a/src/backend/langflow/api/v1/flows.py
+++ b/src/backend/langflow/api/v1/flows.py
@@ -1,16 +1,15 @@
from typing import List
from uuid import UUID
-from langflow.settings import settings
from langflow.api.utils import remove_api_keys
from langflow.api.v1.schemas import FlowListCreate, FlowListRead
-from langflow.database.models.flow import (
+from langflow.services.database.models.flow import (
Flow,
FlowCreate,
FlowRead,
- FlowReadWithStyle,
FlowUpdate,
)
-from langflow.database.base import get_session
+from langflow.services.utils import get_session
+from langflow.services.utils import get_settings_manager
from sqlmodel import Session, select
from fastapi import APIRouter, Depends, HTTPException
from fastapi.encoders import jsonable_encoder
@@ -32,7 +31,7 @@ def create_flow(*, session: Session = Depends(get_session), flow: FlowCreate):
return db_flow
-@router.get("/", response_model=list[FlowReadWithStyle], status_code=200)
+@router.get("/", response_model=list[FlowRead], status_code=200)
def read_flows(*, session: Session = Depends(get_session)):
"""Read all flows."""
try:
@@ -42,7 +41,7 @@ def read_flows(*, session: Session = Depends(get_session)):
return [jsonable_encoder(flow) for flow in flows]
-@router.get("/{flow_id}", response_model=FlowReadWithStyle, status_code=200)
+@router.get("/{flow_id}", response_model=FlowRead, status_code=200)
def read_flow(*, session: Session = Depends(get_session), flow_id: UUID):
"""Read a flow."""
if flow := session.get(Flow, flow_id):
@@ -61,7 +60,8 @@ def update_flow(
if not db_flow:
raise HTTPException(status_code=404, detail="Flow not found")
flow_data = flow.dict(exclude_unset=True)
- if settings.remove_api_keys:
+ settings_manager = get_settings_manager()
+ if settings_manager.settings.REMOVE_API_KEYS:
flow_data = remove_api_keys(flow_data)
for key, value in flow_data.items():
setattr(db_flow, key, value)
diff --git a/src/backend/langflow/api/v1/schemas.py b/src/backend/langflow/api/v1/schemas.py
index e4b9a6e84..776e90034 100644
--- a/src/backend/langflow/api/v1/schemas.py
+++ b/src/backend/langflow/api/v1/schemas.py
@@ -1,7 +1,7 @@
from enum import Enum
from pathlib import Path
from typing import Any, Dict, List, Optional, Union
-from langflow.database.models.flow import FlowCreate, FlowRead
+from langflow.services.database.models.flow import FlowCreate, FlowRead
from pydantic import BaseModel, Field, validator
import json
@@ -116,3 +116,20 @@ class StreamData(BaseModel):
def __str__(self) -> str:
return f"event: {self.event}\ndata: {json.dumps(self.data)}\n\n"
+
+
+class CustomComponentCode(BaseModel):
+ code: str
+
+
+class CustomComponentResponseError(BaseModel):
+ detail: str
+ traceback: str
+
+
+class ComponentListCreate(BaseModel):
+ flows: List[FlowCreate]
+
+
+class ComponentListRead(BaseModel):
+ flows: List[FlowRead]
diff --git a/src/backend/langflow/cache/__init__.py b/src/backend/langflow/cache/__init__.py
deleted file mode 100644
index 723aa9e18..000000000
--- a/src/backend/langflow/cache/__init__.py
+++ /dev/null
@@ -1,7 +0,0 @@
-from langflow.cache.manager import cache_manager
-from langflow.cache.flow import InMemoryCache
-
-__all__ = [
- "cache_manager",
- "InMemoryCache",
-]
diff --git a/src/backend/langflow/components/__init__.py b/src/backend/langflow/components/__init__.py
new file mode 100644
index 000000000..765042210
--- /dev/null
+++ b/src/backend/langflow/components/__init__.py
@@ -0,0 +1,4 @@
+from langflow.interface.custom.custom_component import CustomComponent
+
+
+__all__ = ["CustomComponent"]
diff --git a/src/backend/langflow/components/chains/PromptRunner.py b/src/backend/langflow/components/chains/PromptRunner.py
new file mode 100644
index 000000000..141941c38
--- /dev/null
+++ b/src/backend/langflow/components/chains/PromptRunner.py
@@ -0,0 +1,33 @@
+from langflow import CustomComponent
+
+from langchain.llms.base import BaseLLM
+from langchain import PromptTemplate
+from langchain.schema import Document
+
+
+class PromptRunner(CustomComponent):
+ display_name: str = "Prompt Runner"
+ description: str = "Run a Chain with the given PromptTemplate"
+ beta = True
+ field_config = {
+ "llm": {"display_name": "LLM"},
+ "prompt": {
+ "display_name": "Prompt Template",
+ "info": "Make sure the prompt has all variables filled.",
+ },
+ "code": {"show": False},
+ "inputs": {"field_type": "code"},
+ }
+
+ def build(
+ self,
+ llm: BaseLLM,
+ prompt: PromptTemplate,
+ ) -> Document:
+ chain = prompt | llm
+ # The input is an empty dict because the prompt is already filled
+ result = chain.invoke({})
+ if hasattr(result, "content"):
+ result = result.content
+ self.repr_value = result
+ return Document(page_content=str(result))
diff --git a/src/backend/langflow/chat/__init__.py b/src/backend/langflow/components/chains/__init__.py
similarity index 100%
rename from src/backend/langflow/chat/__init__.py
rename to src/backend/langflow/components/chains/__init__.py
diff --git a/src/backend/langflow/components/toolkits/Metaphor.py b/src/backend/langflow/components/toolkits/Metaphor.py
new file mode 100644
index 000000000..6f43d24b4
--- /dev/null
+++ b/src/backend/langflow/components/toolkits/Metaphor.py
@@ -0,0 +1,56 @@
+from typing import List, Union
+from langflow import CustomComponent
+
+from metaphor_python import Metaphor # type: ignore
+from langchain.tools import Tool
+from langchain.agents import tool
+from langchain.agents.agent_toolkits.base import BaseToolkit
+
+
+class MetaphorToolkit(CustomComponent):
+ display_name: str = "Metaphor"
+ description: str = "Metaphor Toolkit"
+ documentation = (
+ "https://python.langchain.com/docs/integrations/tools/metaphor_search"
+ )
+ beta = True
+    # The API key field is masked in the UI (password=True)
+ field_config = {
+ "metaphor_api_key": {"display_name": "Metaphor API Key", "password": True},
+ "code": {"advanced": True},
+ }
+
+ def build(
+ self,
+ metaphor_api_key: str,
+ use_autoprompt: bool = True,
+ search_num_results: int = 5,
+ similar_num_results: int = 5,
+ ) -> Union[Tool, BaseToolkit]:
+        # Create the Metaphor client with the provided API key
+ client = Metaphor(api_key=metaphor_api_key)
+
+ @tool
+ def search(query: str):
+ """Call search engine with a query."""
+ return client.search(
+ query, use_autoprompt=use_autoprompt, num_results=search_num_results
+ )
+
+ @tool
+ def get_contents(ids: List[str]):
+ """Get contents of a webpage.
+
+ The ids passed in should be a list of ids as fetched from `search`.
+ """
+ return client.get_contents(ids)
+
+ @tool
+ def find_similar(url: str):
+ """Get search results similar to a given URL.
+
+ The url passed in should be a URL returned from `search`
+ """
+ return client.find_similar(url, num_results=similar_num_results)
+
+ return [search, get_contents, find_similar] # type: ignore
diff --git a/src/backend/langflow/database/__init__.py b/src/backend/langflow/components/toolkits/__init__.py
similarity index 100%
rename from src/backend/langflow/database/__init__.py
rename to src/backend/langflow/components/toolkits/__init__.py
diff --git a/src/backend/langflow/components/vectorstores/Vectara.py b/src/backend/langflow/components/vectorstores/Vectara.py
new file mode 100644
index 000000000..6edc69822
--- /dev/null
+++ b/src/backend/langflow/components/vectorstores/Vectara.py
@@ -0,0 +1,50 @@
+from typing import Optional, Union
+from langflow import CustomComponent
+
+from langchain.vectorstores import Vectara
+from langchain.schema import Document
+from langchain.vectorstores.base import VectorStore
+from langchain.schema import BaseRetriever
+from langchain.embeddings.base import Embeddings
+
+
+class VectaraComponent(CustomComponent):
+ display_name: str = "Vectara"
+ description: str = "Implementation of Vector Store using Vectara"
+ documentation = (
+ "https://python.langchain.com/docs/integrations/vectorstores/vectara"
+ )
+ beta = True
+    # The API key field is masked in the UI (password=True)
+ field_config = {
+ "vectara_customer_id": {"display_name": "Vectara Customer ID"},
+ "vectara_corpus_id": {"display_name": "Vectara Corpus ID"},
+ "vectara_api_key": {"display_name": "Vectara API Key", "password": True},
+ "code": {"show": False},
+ "documents": {"display_name": "Documents"},
+ "embedding": {"display_name": "Embedding"},
+ }
+
+ def build(
+ self,
+ vectara_customer_id: str,
+ vectara_corpus_id: str,
+ vectara_api_key: str,
+ embedding: Optional[Embeddings] = None,
+ documents: Optional[Document] = None,
+ ) -> Union[VectorStore, BaseRetriever]:
+ # If documents, then we need to create a Vectara instance using .from_documents
+ if documents is not None and embedding is not None:
+ return Vectara.from_documents(
+ documents=documents, # type: ignore
+ vectara_customer_id=vectara_customer_id,
+ vectara_corpus_id=vectara_corpus_id,
+ vectara_api_key=vectara_api_key,
+ embedding=embedding,
+ )
+
+ return Vectara(
+ vectara_customer_id=vectara_customer_id,
+ vectara_corpus_id=vectara_corpus_id,
+ vectara_api_key=vectara_api_key,
+ )
diff --git a/src/backend/langflow/database/models/__init__.py b/src/backend/langflow/components/vectorstores/__init__.py
similarity index 100%
rename from src/backend/langflow/database/models/__init__.py
rename to src/backend/langflow/components/vectorstores/__init__.py
diff --git a/src/backend/langflow/config.yaml b/src/backend/langflow/config.yaml
index 3116e74c7..d25893c25 100644
--- a/src/backend/langflow/config.yaml
+++ b/src/backend/langflow/config.yaml
@@ -104,6 +104,8 @@ embeddings:
documentation: "https://python.langchain.com/docs/modules/data_connection/text_embedding/integrations/sentence_transformers"
CohereEmbeddings:
documentation: "https://python.langchain.com/docs/modules/data_connection/text_embedding/integrations/cohere"
+ VertexAIEmbeddings:
+ documentation: "https://python.langchain.com/docs/modules/data_connection/text_embedding/integrations/google_vertex_ai_palm"
llms:
OpenAI:
documentation: "https://python.langchain.com/docs/modules/model_io/models/llms/integrations/openai"
@@ -127,8 +129,8 @@ llms:
# There's a bug in this component deactivating until we get it sorted: _language_models.py", line 804, in send_message
# is_blocked=safety_attributes.get("blocked", False),
# AttributeError: 'list' object has no attribute 'get'
- # ChatVertexAI:
- # documentation: "https://python.langchain.com/docs/modules/model_io/models/chat/integrations/google_vertex_ai_palm"
+ ChatVertexAI:
+ documentation: "https://python.langchain.com/docs/modules/model_io/models/chat/integrations/google_vertex_ai_palm"
###
memories:
# https://github.com/supabase-community/supabase-py/issues/482
@@ -153,6 +155,8 @@ memories:
documentation: "https://python.langchain.com/docs/modules/memory/how_to/vectorstore_retriever_memory"
MongoDBChatMessageHistory:
documentation: "https://python.langchain.com/docs/modules/memory/integrations/mongodb_chat_message_history"
+ MotorheadMemory:
+ documentation: "https://python.langchain.com/docs/integrations/memory/motorhead_memory"
prompts:
ChatMessagePromptTemplate:
documentation: "https://python.langchain.com/docs/modules/model_io/prompts/prompt_templates/msg_prompt_templates"
@@ -290,3 +294,6 @@ output_parsers:
documentation: "https://python.langchain.com/docs/modules/model_io/output_parsers/structured"
ResponseSchema:
documentation: "https://python.langchain.com/docs/modules/model_io/output_parsers/structured"
+custom_components:
+ CustomComponent:
+ documentation: ""
diff --git a/src/backend/langflow/custom/customs.py b/src/backend/langflow/custom/customs.py
index 58ef1b508..55d855197 100644
--- a/src/backend/langflow/custom/customs.py
+++ b/src/backend/langflow/custom/customs.py
@@ -31,6 +31,9 @@ CUSTOM_NODES = {
"MidJourneyPromptChain": frontend_node.chains.MidJourneyPromptChainNode(),
"load_qa_chain": frontend_node.chains.CombineDocsChainNode(),
},
+ "custom_components": {
+ "CustomComponent": frontend_node.custom_components.CustomComponentFrontendNode(),
+ },
}
diff --git a/src/backend/langflow/database/base.py b/src/backend/langflow/database/base.py
deleted file mode 100644
index 256434523..000000000
--- a/src/backend/langflow/database/base.py
+++ /dev/null
@@ -1,37 +0,0 @@
-from langflow.settings import settings
-from sqlmodel import SQLModel, Session, create_engine
-from langflow.utils.logger import logger
-
-if settings.database_url and settings.database_url.startswith("sqlite"):
- connect_args = {"check_same_thread": False}
-else:
- connect_args = {}
-if not settings.database_url:
- raise RuntimeError("No database_url provided")
-engine = create_engine(settings.database_url, connect_args=connect_args)
-
-
-def create_db_and_tables():
- logger.debug("Creating database and tables")
- try:
- SQLModel.metadata.create_all(engine)
- except Exception as exc:
- logger.error(f"Error creating database and tables: {exc}")
- raise RuntimeError("Error creating database and tables") from exc
- # Now check if the table Flow exists, if not, something went wrong
- # and we need to create the tables again.
- from sqlalchemy import inspect
-
- inspector = inspect(engine)
- if "flow" not in inspector.get_table_names():
- logger.error("Something went wrong creating the database and tables.")
- logger.error("Please check your database settings.")
-
- raise RuntimeError("Something went wrong creating the database and tables.")
- else:
- logger.debug("Database and tables created successfully")
-
-
-def get_session():
- with Session(engine) as session:
- yield session
diff --git a/src/backend/langflow/database/models/flow_style.py b/src/backend/langflow/database/models/flow_style.py
deleted file mode 100644
index fe53799fe..000000000
--- a/src/backend/langflow/database/models/flow_style.py
+++ /dev/null
@@ -1,33 +0,0 @@
-# Path: src/backend/langflow/database/models/flowstyle.py
-
-from langflow.database.models.base import SQLModelSerializable
-from sqlmodel import Field, Relationship
-from uuid import UUID, uuid4
-from typing import TYPE_CHECKING, Optional
-
-if TYPE_CHECKING:
- from langflow.database.models.flow import Flow
-
-
-class FlowStyleBase(SQLModelSerializable):
- color: str
- emoji: str
- flow_id: UUID = Field(default=None, foreign_key="flow.id")
-
-
-class FlowStyle(FlowStyleBase, table=True):
- id: UUID = Field(default_factory=uuid4, primary_key=True, unique=True)
- flow: "Flow" = Relationship(back_populates="style")
-
-
-class FlowStyleUpdate(SQLModelSerializable):
- color: Optional[str] = None
- emoji: Optional[str] = None
-
-
-class FlowStyleCreate(FlowStyleBase):
- pass
-
-
-class FlowStyleRead(FlowStyleBase):
- id: UUID
diff --git a/src/backend/langflow/graph/graph/base.py b/src/backend/langflow/graph/graph/base.py
index 0d93dd0db..f0d3986cf 100644
--- a/src/backend/langflow/graph/graph/base.py
+++ b/src/backend/langflow/graph/graph/base.py
@@ -1,7 +1,7 @@
from typing import Dict, Generator, List, Type, Union
from langflow.graph.edge.base import Edge
-from langflow.graph.graph.constants import VERTEX_TYPE_MAP
+from langflow.graph.graph.constants import lazy_load_vertex_dict
from langflow.graph.vertex.base import Vertex
from langflow.graph.vertex.types import (
FileToolVertex,
@@ -77,6 +77,8 @@ class Graph:
def _validate_nodes(self) -> None:
"""Check that all nodes have edges"""
+ if len(self.nodes) == 1:
+ return
for node in self.nodes:
if not self._validate_node(node):
raise ValueError(
@@ -185,10 +187,12 @@ class Graph:
"""Returns the node class based on the node type."""
if node_type in FILE_TOOLS:
return FileToolVertex
- if node_type in VERTEX_TYPE_MAP:
- return VERTEX_TYPE_MAP[node_type]
+ if node_type in lazy_load_vertex_dict.VERTEX_TYPE_MAP:
+ return lazy_load_vertex_dict.VERTEX_TYPE_MAP[node_type]
return (
- VERTEX_TYPE_MAP[node_lc_type] if node_lc_type in VERTEX_TYPE_MAP else Vertex
+ lazy_load_vertex_dict.VERTEX_TYPE_MAP[node_lc_type]
+ if node_lc_type in lazy_load_vertex_dict.VERTEX_TYPE_MAP
+ else Vertex
)
def _build_vertices(self) -> List[Vertex]:
diff --git a/src/backend/langflow/graph/graph/constants.py b/src/backend/langflow/graph/graph/constants.py
index a2fd287eb..c9fea48b5 100644
--- a/src/backend/langflow/graph/graph/constants.py
+++ b/src/backend/langflow/graph/graph/constants.py
@@ -1,4 +1,3 @@
-from langflow.graph.vertex.base import Vertex
from langflow.graph.vertex import types
from langflow.interface.agents.base import agent_creator
from langflow.interface.chains.base import chain_creator
@@ -14,23 +13,46 @@ from langflow.interface.vector_store.base import vectorstore_creator
from langflow.interface.wrappers.base import wrapper_creator
from langflow.interface.output_parsers.base import output_parser_creator
from langflow.interface.retrievers.base import retriever_creator
-
-from typing import Dict, Type
+from langflow.interface.custom.base import custom_component_creator
+from langflow.utils.lazy_load import LazyLoadDictBase
-VERTEX_TYPE_MAP: Dict[str, Type[Vertex]] = {
- **{t: types.PromptVertex for t in prompt_creator.to_list()},
- **{t: types.AgentVertex for t in agent_creator.to_list()},
- **{t: types.ChainVertex for t in chain_creator.to_list()},
- **{t: types.ToolVertex for t in tool_creator.to_list()},
- **{t: types.ToolkitVertex for t in toolkits_creator.to_list()},
- **{t: types.WrapperVertex for t in wrapper_creator.to_list()},
- **{t: types.LLMVertex for t in llm_creator.to_list()},
- **{t: types.MemoryVertex for t in memory_creator.to_list()},
- **{t: types.EmbeddingVertex for t in embedding_creator.to_list()},
- **{t: types.VectorStoreVertex for t in vectorstore_creator.to_list()},
- **{t: types.DocumentLoaderVertex for t in documentloader_creator.to_list()},
- **{t: types.TextSplitterVertex for t in textsplitter_creator.to_list()},
- **{t: types.OutputParserVertex for t in output_parser_creator.to_list()},
- **{t: types.RetrieverVertex for t in retriever_creator.to_list()},
-}
+class VertexTypesDict(LazyLoadDictBase):
+ def __init__(self):
+ self._all_types_dict = None
+
+ @property
+ def VERTEX_TYPE_MAP(self):
+ return self.all_types_dict
+
+ def _build_dict(self):
+ langchain_types_dict = self.get_type_dict()
+ return {
+ **langchain_types_dict,
+ "Custom": ["Custom Tool", "Python Function"],
+ }
+
+ def get_type_dict(self):
+ return {
+ **{t: types.PromptVertex for t in prompt_creator.to_list()},
+ **{t: types.AgentVertex for t in agent_creator.to_list()},
+ **{t: types.ChainVertex for t in chain_creator.to_list()},
+ **{t: types.ToolVertex for t in tool_creator.to_list()},
+ **{t: types.ToolkitVertex for t in toolkits_creator.to_list()},
+ **{t: types.WrapperVertex for t in wrapper_creator.to_list()},
+ **{t: types.LLMVertex for t in llm_creator.to_list()},
+ **{t: types.MemoryVertex for t in memory_creator.to_list()},
+ **{t: types.EmbeddingVertex for t in embedding_creator.to_list()},
+ **{t: types.VectorStoreVertex for t in vectorstore_creator.to_list()},
+ **{t: types.DocumentLoaderVertex for t in documentloader_creator.to_list()},
+ **{t: types.TextSplitterVertex for t in textsplitter_creator.to_list()},
+ **{t: types.OutputParserVertex for t in output_parser_creator.to_list()},
+ **{
+ t: types.CustomComponentVertex
+ for t in custom_component_creator.to_list()
+ },
+ **{t: types.RetrieverVertex for t in retriever_creator.to_list()},
+ }
+
+
+lazy_load_vertex_dict = VertexTypesDict()
diff --git a/src/backend/langflow/graph/vertex/base.py b/src/backend/langflow/graph/vertex/base.py
index 2c749b85f..ac7f72b4d 100644
--- a/src/backend/langflow/graph/vertex/base.py
+++ b/src/backend/langflow/graph/vertex/base.py
@@ -1,5 +1,6 @@
+import ast
from langflow.interface.initialize import loading
-from langflow.interface.listing import ALL_TYPES_DICT
+from langflow.interface.listing import lazy_load_dict
from langflow.utils.constants import DIRECT_TYPES
from langflow.utils.logger import logger
from langflow.utils.util import sync_to_async
@@ -61,7 +62,7 @@ class Vertex:
)
if self.base_type is None:
- for base_type, value in ALL_TYPES_DICT.items():
+ for base_type, value in lazy_load_dict.ALL_TYPES_DICT.items():
if self.vertex_type in value:
self.base_type = base_type
break
@@ -100,7 +101,9 @@ class Vertex:
params[param_key] = edge.source
for key, value in template_dict.items():
- if key == "_type" or not value.get("show"):
+            # Skip _type and any field with show == False, except "code":
+            # code is hidden in the UI but still needed to build the component
+ if key == "_type" or (not value.get("show") and key != "code"):
continue
# If the type is not transformable to a python base class
# then we need to get the edge that connects to this node
@@ -112,7 +115,14 @@ class Vertex:
params[key] = file_path
elif value.get("type") in DIRECT_TYPES and params.get(key) is None:
- params[key] = value.get("value")
+ if value.get("type") == "code":
+ try:
+ params[key] = ast.literal_eval(value.get("value"))
+ except Exception as exc:
+ logger.debug(f"Error parsing code: {exc}")
+ params[key] = value.get("value")
+ else:
+ params[key] = value.get("value")
if not value.get("required") and params.get(key) is None:
if value.get("default"):
@@ -259,4 +269,8 @@ class Vertex:
def _built_object_repr(self):
# Add a message with an emoji, stars for sucess,
-        return "Built sucessfully ✨" if self._built_object else "Failed to build 😵‍💫"
+        return (
+            "Built successfully ✨"
+            if self._built_object is not None
+            else "Failed to build 😵‍💫"
+        )
diff --git a/src/backend/langflow/graph/vertex/types.py b/src/backend/langflow/graph/vertex/types.py
index 20ec3e66d..b7ac17983 100644
--- a/src/backend/langflow/graph/vertex/types.py
+++ b/src/backend/langflow/graph/vertex/types.py
@@ -226,7 +226,11 @@ class PromptVertex(Vertex):
# so the prompt format doesn't break
artifacts.pop("handle_keys", None)
try:
- template = self._built_object.format(**artifacts)
+ template = self._built_object.template
+ for key, value in artifacts.items():
+ if value:
+ replace_key = "{" + key + "}"
+ template = template.replace(replace_key, value)
return (
template
if isinstance(template, str)
@@ -239,3 +243,12 @@ class PromptVertex(Vertex):
class OutputParserVertex(Vertex):
def __init__(self, data: Dict):
super().__init__(data, base_type="output_parsers")
+
+
+class CustomComponentVertex(Vertex):
+ def __init__(self, data: Dict):
+ super().__init__(data, base_type="custom_components")
+
+    def _built_object_repr(self):
+        if self.artifacts and "repr" in self.artifacts:
+            return self.artifacts["repr"] or super()._built_object_repr()
+        return super()._built_object_repr()
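The PromptVertex change above replaces `.format(**artifacts)` with targeted string replacement, so a missing artifact no longer raises `KeyError` and its `{placeholder}` survives for later filling. A standalone sketch of that behavior:

```python
def fill_prompt(template: str, artifacts: dict) -> str:
    """Replace only the {variables} we have truthy values for; leave the rest intact."""
    for key, value in artifacts.items():
        if value:
            template = template.replace("{" + key + "}", str(value))
    return template
```

Unlike `str.format`, this is tolerant by construction: unfilled and unknown placeholders pass through unchanged.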
diff --git a/src/backend/langflow/interface/agents/base.py b/src/backend/langflow/interface/agents/base.py
index b272144bc..ec8c42aba 100644
--- a/src/backend/langflow/interface/agents/base.py
+++ b/src/backend/langflow/interface/agents/base.py
@@ -5,7 +5,8 @@ from langchain.agents import types
from langflow.custom.customs import get_custom_nodes
from langflow.interface.agents.custom import CUSTOM_AGENTS
from langflow.interface.base import LangChainTypeCreator
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.template.frontend_node.agents import AgentFrontendNode
from langflow.utils.logger import logger
from langflow.utils.util import build_template_from_class, build_template_from_method
@@ -53,13 +54,17 @@ class AgentCreator(LangChainTypeCreator):
# Now this is a generator
def to_list(self) -> List[str]:
names = []
+ settings_manager = get_settings_manager()
for _, agent in self.type_to_loader_dict.items():
agent_name = (
agent.function_name()
if hasattr(agent, "function_name")
else agent.__name__
)
- if agent_name in settings.agents or settings.dev:
+ if (
+ agent_name in settings_manager.settings.AGENTS
+ or settings_manager.settings.DEV
+ ):
names.append(agent_name)
return names
diff --git a/src/backend/langflow/interface/base.py b/src/backend/langflow/interface/base.py
index 6e1522dd2..d1ed83b5a 100644
--- a/src/backend/langflow/interface/base.py
+++ b/src/backend/langflow/interface/base.py
@@ -2,13 +2,14 @@ from abc import ABC, abstractmethod
from typing import Any, Dict, List, Optional, Type, Union
from langchain.chains.base import Chain
from langchain.agents import AgentExecutor
+from langflow.services.utils import get_settings_manager
from pydantic import BaseModel
from langflow.template.field.base import TemplateField
from langflow.template.frontend_node.base import FrontendNode
from langflow.template.template.base import Template
from langflow.utils.logger import logger
-from langflow.settings import settings
+
# Assuming necessary imports for Field, Template, and FrontendNode classes
@@ -26,15 +27,18 @@ class LangChainTypeCreator(BaseModel, ABC):
@property
def docs_map(self) -> Dict[str, str]:
"""A dict with the name of the component as key and the documentation link as value."""
+ settings_manager = get_settings_manager()
if self.name_docs_dict is None:
try:
- type_settings = getattr(settings, self.type_name)
+ type_settings = getattr(
+ settings_manager.settings, self.type_name.upper()
+ )
self.name_docs_dict = {
name: value_dict["documentation"]
for name, value_dict in type_settings.items()
}
except AttributeError as exc:
- logger.error(exc)
+ logger.error(f"Error getting settings for {self.type_name}: {exc}")
self.name_docs_dict = {}
return self.name_docs_dict
diff --git a/src/backend/langflow/interface/chains/base.py b/src/backend/langflow/interface/chains/base.py
index 67d31308f..b906dbd25 100644
--- a/src/backend/langflow/interface/chains/base.py
+++ b/src/backend/langflow/interface/chains/base.py
@@ -3,11 +3,13 @@ from typing import Any, Dict, List, Optional, Type
from langflow.custom.customs import get_custom_nodes
from langflow.interface.base import LangChainTypeCreator
from langflow.interface.importing.utils import import_class
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.template.frontend_node.chains import ChainFrontendNode
from langflow.utils.logger import logger
from langflow.utils.util import build_template_from_class, build_template_from_method
from langchain import chains
+from langchain_experimental.sql import SQLDatabaseChain # type: ignore
# Assuming necessary imports for Field, Template, and FrontendNode classes
@@ -29,18 +31,22 @@ class ChainCreator(LangChainTypeCreator):
@property
def type_to_loader_dict(self) -> Dict:
if self.type_dict is None:
+ settings_manager = get_settings_manager()
self.type_dict: dict[str, Any] = {
chain_name: import_class(f"langchain.chains.{chain_name}")
for chain_name in chains.__all__
}
from langflow.interface.chains.custom import CUSTOM_CHAINS
+ self.type_dict["SQLDatabaseChain"] = SQLDatabaseChain
+
self.type_dict.update(CUSTOM_CHAINS)
# Filter according to settings.chains
self.type_dict = {
name: chain
for name, chain in self.type_dict.items()
- if name in settings.chains or settings.dev
+ if name in settings_manager.settings.CHAINS
+ or settings_manager.settings.DEV
}
return self.type_dict
diff --git a/src/backend/langflow/interface/custom/__init__.py b/src/backend/langflow/interface/custom/__init__.py
new file mode 100644
index 000000000..5b87e9fa3
--- /dev/null
+++ b/src/backend/langflow/interface/custom/__init__.py
@@ -0,0 +1,4 @@
+from langflow.interface.custom.base import CustomComponentCreator
+from langflow.interface.custom.custom_component import CustomComponent
+
+__all__ = ["CustomComponentCreator", "CustomComponent"]
diff --git a/src/backend/langflow/interface/custom/base.py b/src/backend/langflow/interface/custom/base.py
new file mode 100644
index 000000000..06e874fa7
--- /dev/null
+++ b/src/backend/langflow/interface/custom/base.py
@@ -0,0 +1,48 @@
+from typing import Any, Dict, List, Optional, Type
+
+
+from langflow.interface.base import LangChainTypeCreator
+
+# from langflow.interface.custom.custom import CustomComponent
+from langflow.interface.custom.custom_component import CustomComponent
+from langflow.template.frontend_node.custom_components import (
+ CustomComponentFrontendNode,
+)
+from langflow.utils.logger import logger
+
+# Assuming necessary imports for Field, Template, and FrontendNode classes
+
+
+class CustomComponentCreator(LangChainTypeCreator):
+ type_name: str = "custom_components"
+
+ @property
+ def frontend_node_class(self) -> Type[CustomComponentFrontendNode]:
+ return CustomComponentFrontendNode
+
+ @property
+ def type_to_loader_dict(self) -> Dict:
+ if self.type_dict is None:
+ self.type_dict: dict[str, Any] = {
+ "CustomComponent": CustomComponent,
+ }
+ return self.type_dict
+
+ def get_signature(self, name: str) -> Optional[Dict]:
+ from langflow.custom.customs import get_custom_nodes
+
+ try:
+            custom_nodes = get_custom_nodes(self.type_name)
+            if name in custom_nodes:
+                return custom_nodes[name]
+ except ValueError as exc:
+ raise ValueError(f"CustomComponent {name} not found: {exc}") from exc
+ except AttributeError as exc:
+ logger.error(f"CustomComponent {name} not loaded: {exc}")
+ return None
+ return None
+
+ def to_list(self) -> List[str]:
+ return list(self.type_to_loader_dict.keys())
+
+
+custom_component_creator = CustomComponentCreator()
diff --git a/src/backend/langflow/interface/custom/code_parser.py b/src/backend/langflow/interface/custom/code_parser.py
new file mode 100644
index 000000000..d42f82635
--- /dev/null
+++ b/src/backend/langflow/interface/custom/code_parser.py
@@ -0,0 +1,272 @@
+import ast
+import inspect
+import traceback
+
+from typing import Dict, Any, List, Type, Union
+from fastapi import HTTPException
+from langflow.interface.custom.schema import CallableCodeDetails, ClassCodeDetails
+
+
+class CodeSyntaxError(HTTPException):
+ pass
+
+
+class CodeParser:
+ """
+ A parser for Python source code, extracting code details.
+ """
+
+ def __init__(self, code: Union[str, Type]) -> None:
+ """
+ Initializes the parser with the provided code.
+ """
+        if inspect.isclass(code):
+            # If the code is a class, extract its source code
+            code = inspect.getsource(code)
+ self.code = code
+ self.data: Dict[str, Any] = {
+ "imports": [],
+ "functions": [],
+ "classes": [],
+ "global_vars": [],
+ }
+ self.handlers = {
+ ast.Import: self.parse_imports,
+ ast.ImportFrom: self.parse_imports,
+ ast.FunctionDef: self.parse_functions,
+ ast.ClassDef: self.parse_classes,
+ ast.Assign: self.parse_global_vars,
+ }
+
+ def __get_tree(self):
+ """
+ Parses the provided code to validate its syntax.
+ It tries to parse the code into an abstract syntax tree (AST).
+ """
+ try:
+ tree = ast.parse(self.code)
+ except SyntaxError as err:
+ raise CodeSyntaxError(
+ status_code=400,
+ detail={"error": err.msg, "traceback": traceback.format_exc()},
+ ) from err
+
+ return tree
+
+ def parse_node(self, node: Union[ast.stmt, ast.AST]) -> None:
+ """
+ Parses an AST node and updates the data
+ dictionary with the relevant information.
+ """
+ if handler := self.handlers.get(type(node)): # type: ignore
+ handler(node) # type: ignore
+
+ def parse_imports(self, node: Union[ast.Import, ast.ImportFrom]) -> None:
+ """
+ Extracts "imports" from the code.
+ """
+ if isinstance(node, ast.Import):
+ for alias in node.names:
+ self.data["imports"].append(alias.name)
+ elif isinstance(node, ast.ImportFrom):
+ for alias in node.names:
+ self.data["imports"].append((node.module, alias.name))
+
+ def parse_functions(self, node: ast.FunctionDef) -> None:
+ """
+ Extracts "functions" from the code.
+ """
+ self.data["functions"].append(self.parse_callable_details(node))
+
+ def parse_arg(self, arg, default):
+ """
+ Parses an argument and its default value.
+ """
+ arg_dict = {"name": arg.arg, "default": default}
+ if arg.annotation:
+ arg_dict["type"] = ast.unparse(arg.annotation)
+ return arg_dict
+
+ def parse_callable_details(self, node: ast.FunctionDef) -> Dict[str, Any]:
+ """
+ Extracts details from a single function or method node.
+ """
+ func = CallableCodeDetails(
+ name=node.name,
+ doc=ast.get_docstring(node),
+ args=[],
+ body=[],
+ return_type=ast.unparse(node.returns) if node.returns else None,
+ )
+
+ func.args = self.parse_function_args(node)
+ func.body = self.parse_function_body(node)
+
+ return func.dict()
+
+ def parse_function_args(self, node: ast.FunctionDef) -> List[Dict[str, Any]]:
+ """
+ Parses the arguments of a function or method node.
+ """
+ args = []
+
+ args += self.parse_positional_args(node)
+ args += self.parse_varargs(node)
+ args += self.parse_keyword_args(node)
+ args += self.parse_kwargs(node)
+
+ return args
+
+ def parse_positional_args(self, node: ast.FunctionDef) -> List[Dict[str, Any]]:
+ """
+ Parses the positional arguments of a function or method node.
+ """
+ num_args = len(node.args.args)
+ num_defaults = len(node.args.defaults)
+ num_missing_defaults = num_args - num_defaults
+ missing_defaults = [None] * num_missing_defaults
+ default_values = [
+ ast.unparse(default).strip("'") if default else None
+ for default in node.args.defaults
+ ]
+        # Normalize the literal string "None" back to an actual None default
+ default_values = [
+ None if value == "None" else value for value in default_values
+ ]
+
+ defaults = missing_defaults + default_values
+
+ args = [
+ self.parse_arg(arg, default)
+ for arg, default in zip(node.args.args, defaults)
+ ]
+ return args
+
+ def parse_varargs(self, node: ast.FunctionDef) -> List[Dict[str, Any]]:
+ """
+ Parses the *args argument of a function or method node.
+ """
+ args = []
+
+ if node.args.vararg:
+ args.append(self.parse_arg(node.args.vararg, None))
+
+ return args
+
+ def parse_keyword_args(self, node: ast.FunctionDef) -> List[Dict[str, Any]]:
+ """
+ Parses the keyword-only arguments of a function or method node.
+ """
+ kw_defaults = [None] * (
+ len(node.args.kwonlyargs) - len(node.args.kw_defaults)
+ ) + [
+ ast.unparse(default) if default else None
+ for default in node.args.kw_defaults
+ ]
+
+ args = [
+ self.parse_arg(arg, default)
+ for arg, default in zip(node.args.kwonlyargs, kw_defaults)
+ ]
+ return args
+
+ def parse_kwargs(self, node: ast.FunctionDef) -> List[Dict[str, Any]]:
+ """
+ Parses the **kwargs argument of a function or method node.
+ """
+ args = []
+
+ if node.args.kwarg:
+ args.append(self.parse_arg(node.args.kwarg, None))
+
+ return args
+
+ def parse_function_body(self, node: ast.FunctionDef) -> List[str]:
+ """
+ Parses the body of a function or method node.
+ """
+ return [ast.unparse(line) for line in node.body]
+
+ def parse_assign(self, stmt):
+ """
+ Parses an Assign statement and returns a dictionary
+ with the target's name and value.
+ """
+ for target in stmt.targets:
+ if isinstance(target, ast.Name):
+ return {"name": target.id, "value": ast.unparse(stmt.value)}
+
+ def parse_ann_assign(self, stmt):
+ """
+ Parses an AnnAssign statement and returns a dictionary
+ with the target's name, value, and annotation.
+ """
+ if isinstance(stmt.target, ast.Name):
+ return {
+ "name": stmt.target.id,
+ "value": ast.unparse(stmt.value) if stmt.value else None,
+ "annotation": ast.unparse(stmt.annotation),
+ }
+
+ def parse_function_def(self, stmt):
+ """
+ Parses a FunctionDef statement and returns the parsed
+ method and a boolean indicating if it's an __init__ method.
+ """
+ method = self.parse_callable_details(stmt)
+ return (method, True) if stmt.name == "__init__" else (method, False)
+
+ def parse_classes(self, node: ast.ClassDef) -> None:
+ """
+ Extracts "classes" from the code, including inheritance and init methods.
+ """
+
+ class_details = ClassCodeDetails(
+ name=node.name,
+ doc=ast.get_docstring(node),
+ bases=[ast.unparse(base) for base in node.bases],
+ attributes=[],
+ methods=[],
+ init=None,
+ )
+
+ for stmt in node.body:
+ if isinstance(stmt, ast.Assign):
+ if attr := self.parse_assign(stmt):
+ class_details.attributes.append(attr)
+ elif isinstance(stmt, ast.AnnAssign):
+ if attr := self.parse_ann_assign(stmt):
+ class_details.attributes.append(attr)
+ elif isinstance(stmt, ast.FunctionDef):
+ method, is_init = self.parse_function_def(stmt)
+ if is_init:
+ class_details.init = method
+ else:
+ class_details.methods.append(method)
+
+ self.data["classes"].append(class_details.dict())
+
+ def parse_global_vars(self, node: ast.Assign) -> None:
+ """
+ Extracts global variables from the code.
+ """
+ global_var = {
+ "targets": [
+ t.id if hasattr(t, "id") else ast.dump(t) for t in node.targets
+ ],
+ "value": ast.unparse(node.value),
+ }
+ self.data["global_vars"].append(global_var)
+
+ def parse_code(self) -> Dict[str, Any]:
+ """
+ Runs all parsing operations and returns the resulting data.
+ """
+ tree = self.__get_tree()
+
+ for node in ast.walk(tree):
+ self.parse_node(node)
+ return self.data
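A note on `parse_positional_args` above: `ast` stores defaults only for the *last* N positional arguments, so the parser left-pads the defaults list with `None` before zipping. A standalone sketch of that alignment (the helper name is illustrative, not part of the diff):

```python
import ast

def positional_args_with_defaults(source: str):
    """Pair each positional argument with its default, mirroring how
    CodeParser.parse_positional_args left-pads ast's right-aligned defaults."""
    func = ast.parse(source).body[0]
    assert isinstance(func, ast.FunctionDef)
    args = func.args.args
    defaults = [ast.unparse(d) for d in func.args.defaults]
    # ast attaches defaults to the trailing args only, so pad on the left
    padded = [None] * (len(args) - len(defaults)) + defaults
    return [(arg.arg, default) for arg, default in zip(args, padded)]

pairs = positional_args_with_defaults("def build(self, url, timeout=30): ...")
# -> [("self", None), ("url", None), ("timeout", "30")]
```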
diff --git a/src/backend/langflow/interface/custom/component.py b/src/backend/langflow/interface/custom/component.py
new file mode 100644
index 000000000..06db5bd46
--- /dev/null
+++ b/src/backend/langflow/interface/custom/component.py
@@ -0,0 +1,75 @@
+import ast
+from typing import Any, Optional
+from pydantic import BaseModel
+from fastapi import HTTPException
+
+from langflow.utils import validate
+from langflow.interface.custom.code_parser import CodeParser
+
+
+class ComponentCodeNullError(HTTPException):
+ pass
+
+
+class ComponentFunctionEntrypointNameNullError(HTTPException):
+ pass
+
+
+class Component(BaseModel):
+ ERROR_CODE_NULL = "Python code must be provided."
+ ERROR_FUNCTION_ENTRYPOINT_NAME_NULL = (
+ "The name of the entrypoint function must be provided."
+ )
+
+ code: Optional[str]
+ function_entrypoint_name = "build"
+ field_config: dict = {}
+
+ def __init__(self, **data):
+ super().__init__(**data)
+
+ def get_code_tree(self, code: str):
+ parser = CodeParser(code)
+ return parser.parse_code()
+
+ def get_function(self):
+ if not self.code:
+ raise ComponentCodeNullError(
+ status_code=400,
+ detail={"error": self.ERROR_CODE_NULL, "traceback": ""},
+ )
+
+ if not self.function_entrypoint_name:
+ raise ComponentFunctionEntrypointNameNullError(
+ status_code=400,
+ detail={
+ "error": self.ERROR_FUNCTION_ENTRYPOINT_NAME_NULL,
+ "traceback": "",
+ },
+ )
+
+ return validate.create_function(self.code, self.function_entrypoint_name)
+
+ def build_template_config(self, attributes) -> dict:
+ template_config = {}
+
+ for item in attributes:
+ item_name = item.get("name")
+
+ if item_value := item.get("value"):
+ if "display_name" in item_name:
+ template_config["display_name"] = ast.literal_eval(item_value)
+
+ elif "description" in item_name:
+ template_config["description"] = ast.literal_eval(item_value)
+
+ elif "beta" in item_name:
+ template_config["beta"] = ast.literal_eval(item_value)
+
+ elif "documentation" in item_name:
+ template_config["documentation"] = ast.literal_eval(item_value)
+
+ return template_config
+
+ def build(self, *args: Any, **kwargs: Any) -> Any:
+ raise NotImplementedError
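`build_template_config` above receives attribute entries whose values are *source strings* (produced by the code parser's `ast.unparse`), so it uses `ast.literal_eval` to turn them back into Python values. A minimal sketch with hypothetical input data:

```python
import ast

# Attribute entries as the code parser would emit them: values are source text
attributes = [
    {"name": "display_name", "value": "'Custom Component'"},
    {"name": "beta", "value": "True"},
    {"name": "field_config", "value": None},  # no value -> skipped
]

template_config = {}
for item in attributes:
    # Only entries with a captured value contribute to the config
    if item_value := item.get("value"):
        if "display_name" in item["name"]:
            template_config["display_name"] = ast.literal_eval(item_value)
        elif "beta" in item["name"]:
            template_config["beta"] = ast.literal_eval(item_value)

# template_config == {"display_name": "Custom Component", "beta": True}
```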
diff --git a/src/backend/langflow/interface/custom/constants.py b/src/backend/langflow/interface/custom/constants.py
new file mode 100644
index 000000000..83cf4b463
--- /dev/null
+++ b/src/backend/langflow/interface/custom/constants.py
@@ -0,0 +1,60 @@
+from langchain import PromptTemplate
+from langchain.chains.base import Chain
+from langchain.document_loaders.base import BaseLoader
+from langchain.embeddings.base import Embeddings
+from langchain.llms.base import BaseLLM
+from langchain.schema import BaseRetriever, Document
+from langchain.text_splitter import TextSplitter
+from langchain.tools import Tool
+from langchain.vectorstores.base import VectorStore
+from langchain.schema import BaseOutputParser
+
+
+LANGCHAIN_BASE_TYPES = {
+ "Chain": Chain,
+ "Tool": Tool,
+ "BaseLLM": BaseLLM,
+ "PromptTemplate": PromptTemplate,
+ "BaseLoader": BaseLoader,
+ "Document": Document,
+ "TextSplitter": TextSplitter,
+ "VectorStore": VectorStore,
+ "Embeddings": Embeddings,
+ "BaseRetriever": BaseRetriever,
+ "BaseOutputParser": BaseOutputParser,
+}
+
+# Langchain base types plus Python base types
+CUSTOM_COMPONENT_SUPPORTED_TYPES = {
+ **LANGCHAIN_BASE_TYPES,
+ "str": str,
+ "int": int,
+ "float": float,
+ "bool": bool,
+ "list": list,
+ "dict": dict,
+}
+
+
+DEFAULT_CUSTOM_COMPONENT_CODE = """from langflow import CustomComponent
+
+from langchain.llms.base import BaseLLM
+from langchain.chains import LLMChain
+from langchain import PromptTemplate
+from langchain.schema import Document
+
+import requests
+
+class YourComponent(CustomComponent):
+ display_name: str = "Custom Component"
+ description: str = "Create any custom component you want!"
+
+ def build_config(self):
+ return { "url": { "multiline": True, "required": True } }
+
+ def build(self, url: str, llm: BaseLLM, prompt: PromptTemplate) -> Document:
+ response = requests.get(url)
+ chain = LLMChain(llm=llm, prompt=prompt)
+ result = chain.run(response.text[:300])
+ return Document(page_content=str(result))
+"""
diff --git a/src/backend/langflow/interface/custom/custom_component.py b/src/backend/langflow/interface/custom/custom_component.py
new file mode 100644
index 000000000..4b3b11ed0
--- /dev/null
+++ b/src/backend/langflow/interface/custom/custom_component.py
@@ -0,0 +1,214 @@
+from typing import Any, Callable, List, Optional
+from fastapi import HTTPException
+from langflow.interface.custom.constants import CUSTOM_COMPONENT_SUPPORTED_TYPES
+from langflow.interface.custom.component import Component
+from langflow.interface.custom.directory_reader import DirectoryReader
+from langflow.services.utils import get_db_manager
+
+from langflow.utils import validate
+
+from langflow.services.database.utils import session_getter
+from langflow.services.database.models.flow import Flow
+from pydantic import Extra
+import yaml
+
+
+class CustomComponent(Component, extra=Extra.allow):
+ code: Optional[str]
+ field_config: dict = {}
+ code_class_base_inheritance = "CustomComponent"
+ function_entrypoint_name = "build"
+ function: Optional[Callable] = None
+ return_type_valid_list = list(CUSTOM_COMPONENT_SUPPORTED_TYPES.keys())
+ repr_value: Optional[str] = ""
+
+ def __init__(self, **data):
+ super().__init__(**data)
+
+ def custom_repr(self):
+ if isinstance(self.repr_value, dict):
+ return yaml.dump(self.repr_value)
+ if isinstance(self.repr_value, str):
+ return self.repr_value
+ return str(self.repr_value)
+
+ def build_config(self):
+ return self.field_config
+
+ def _class_template_validation(self, code: str):
+ TYPE_HINT_LIST = ["Optional", "Prompt", "PromptTemplate", "LLMChain"]
+
+ if not code:
+ raise HTTPException(
+ status_code=400,
+ detail={
+ "error": self.ERROR_CODE_NULL,
+ "traceback": "",
+ },
+ )
+
+ reader = DirectoryReader("", False)
+
+ for type_hint in TYPE_HINT_LIST:
+            if reader._is_type_hint_used_in_args(
+                type_hint, code
+            ) and not reader._is_type_hint_imported(type_hint, code):
+ error_detail = {
+ "error": "Type hint Error",
+ "traceback": f"Type hint '{type_hint}' is used but not imported in the code.",
+ }
+ raise HTTPException(status_code=400, detail=error_detail)
+
+    def is_check_valid(self) -> bool:
+        # _class_template_validation raises on failure and returns nothing,
+        # so reaching the end of it means the code is valid
+        if not self.code:
+            return False
+        self._class_template_validation(self.code)
+        return True
+
+ def get_code_tree(self, code: str):
+ return super().get_code_tree(code)
+
+ @property
+ def get_function_entrypoint_args(self) -> str:
+ if not self.code:
+ return ""
+ tree = self.get_code_tree(self.code)
+
+ component_classes = [
+ cls
+ for cls in tree["classes"]
+ if self.code_class_base_inheritance in cls["bases"]
+ ]
+ if not component_classes:
+ return ""
+
+ # Assume the first Component class is the one we're interested in
+ component_class = component_classes[0]
+ build_methods = [
+ method
+ for method in component_class["methods"]
+ if method["name"] == self.function_entrypoint_name
+ ]
+
+ if not build_methods:
+ return ""
+
+ build_method = build_methods[0]
+
+ return build_method["args"]
+
+ @property
+ def get_function_entrypoint_return_type(self) -> List[str]:
+ if not self.code:
+ return []
+ tree = self.get_code_tree(self.code)
+
+ component_classes = [
+ cls
+ for cls in tree["classes"]
+ if self.code_class_base_inheritance in cls["bases"]
+ ]
+ if not component_classes:
+ return []
+
+ # Assume the first Component class is the one we're interested in
+ component_class = component_classes[0]
+ build_methods = [
+ method
+ for method in component_class["methods"]
+ if method["name"] == self.function_entrypoint_name
+ ]
+
+ if not build_methods:
+ return []
+
+ build_method = build_methods[0]
+ return_type = build_method["return_type"]
+ if not return_type:
+ return []
+ # If the return type is not a Union, then we just return it as a list
+ if "Union" not in return_type:
+ return [return_type] if return_type in self.return_type_valid_list else []
+
+ # If the return type is a Union, then we need to parse it
+ return_type = return_type.replace("Union", "").replace("[", "").replace("]", "")
+ return_type = return_type.split(",")
+ return_type = [item.strip() for item in return_type]
+ return [item for item in return_type if item in self.return_type_valid_list]
+
+ @property
+ def get_main_class_name(self):
+ tree = self.get_code_tree(self.code)
+
+ base_name = self.code_class_base_inheritance
+ method_name = self.function_entrypoint_name
+
+ classes = []
+ for item in tree.get("classes"):
+ if base_name in item["bases"]:
+ method_names = [method["name"] for method in item["methods"]]
+ if method_name in method_names:
+ classes.append(item["name"])
+
+ # Get just the first item
+ return next(iter(classes), "")
+
+ @property
+ def build_template_config(self):
+ tree = self.get_code_tree(self.code)
+
+ attributes = [
+ main_class["attributes"]
+ for main_class in tree.get("classes")
+ if main_class["name"] == self.get_main_class_name
+ ]
+ # Get just the first item
+ attributes = next(iter(attributes), [])
+
+ return super().build_template_config(attributes)
+
+ @property
+ def get_function(self):
+ return validate.create_function(self.code, self.function_entrypoint_name)
+
+ def load_flow(self, flow_id: str, tweaks: Optional[dict] = None) -> Any:
+ from langflow.processing.process import build_sorted_vertices_with_caching
+ from langflow.processing.process import process_tweaks
+
+ db_manager = get_db_manager()
+ with session_getter(db_manager) as session:
+ graph_data = flow.data if (flow := session.get(Flow, flow_id)) else None
+ if not graph_data:
+ raise ValueError(f"Flow {flow_id} not found")
+ if tweaks:
+ graph_data = process_tweaks(graph_data=graph_data, tweaks=tweaks)
+ return build_sorted_vertices_with_caching(graph_data)
+
+ def list_flows(self, *, get_session: Optional[Callable] = None) -> List[Flow]:
+ get_session = get_session or session_getter
+ db_manager = get_db_manager()
+ with get_session(db_manager) as session:
+ flows = session.query(Flow).all()
+ return flows
+
+ def get_flow(
+ self,
+ *,
+ flow_name: Optional[str] = None,
+ flow_id: Optional[str] = None,
+ tweaks: Optional[dict] = None,
+ get_session: Optional[Callable] = None,
+ ) -> Flow:
+ get_session = get_session or session_getter
+ db_manager = get_db_manager()
+ with get_session(db_manager) as session:
+ if flow_id:
+ flow = session.query(Flow).get(flow_id)
+ elif flow_name:
+ flow = session.query(Flow).filter(Flow.name == flow_name).first()
+ else:
+ raise ValueError("Either flow_name or flow_id must be provided")
+
+ if not flow:
+ raise ValueError(f"Flow {flow_name or flow_id} not found")
+ return self.load_flow(flow.id, tweaks)
+
+ def build(self, *args: Any, **kwargs: Any) -> Any:
+ raise NotImplementedError
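`get_function_entrypoint_return_type` above handles `Union[...]` annotations at the string level rather than via `typing`. A standalone sketch of that parsing (the helper name and the sample whitelist are illustrative, not part of the diff):

```python
def parse_return_type(return_type: str, valid: list[str]) -> list[str]:
    """Split a "Union[A, B]" annotation string into member names and keep
    only those in the supported-types whitelist, as the property above does."""
    if "Union" not in return_type:
        return [return_type] if return_type in valid else []
    # Strip the Union wrapper and brackets, then split on commas
    inner = return_type.replace("Union", "").replace("[", "").replace("]", "")
    members = [item.strip() for item in inner.split(",")]
    return [item for item in members if item in valid]

supported = ["Document", "Chain", "str"]
parse_return_type("Union[Document, str]", supported)  # -> ["Document", "str"]
parse_return_type("VectorStore", supported)           # -> []
```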
diff --git a/src/backend/langflow/interface/custom/directory_reader.py b/src/backend/langflow/interface/custom/directory_reader.py
new file mode 100644
index 000000000..7bff7b5f5
--- /dev/null
+++ b/src/backend/langflow/interface/custom/directory_reader.py
@@ -0,0 +1,272 @@
+import os
+import ast
+import zlib
+from langflow.utils.logger import logger
+
+
+class CustomComponentPathValueError(ValueError):
+ pass
+
+
+class StringCompressor:
+ def __init__(self, input_string):
+ """Initialize StringCompressor with a string to compress."""
+ self.input_string = input_string
+
+ def compress_string(self):
+ """
+ Compress the initial string and return the compressed data.
+ """
+ # Convert string to bytes
+ byte_data = self.input_string.encode("utf-8")
+ # Compress the bytes
+ self.compressed_data = zlib.compress(byte_data)
+
+ return self.compressed_data
+
+ def decompress_string(self):
+ """
+ Decompress the compressed data and return the original string.
+ """
+ # Decompress the bytes
+ decompressed_data = zlib.decompress(self.compressed_data)
+ # Convert bytes back to string
+ return decompressed_data.decode("utf-8")
+
+
+class DirectoryReader:
+    # Base path that files containing custom
+    # components must live under to be read
+ base_path = ""
+
+ def __init__(self, directory_path, compress_code_field=False):
+ """
+ Initialize DirectoryReader with a directory path
+ and a flag indicating whether to compress the code.
+ """
+ self.directory_path = directory_path
+ self.compress_code_field = compress_code_field
+
+ def get_safe_path(self):
+ """Check if the path is valid and return it, or None if it's not."""
+ return self.directory_path if self.is_valid_path() else None
+
+ def is_valid_path(self) -> bool:
+ """Check if the directory path is valid by comparing it to the base path."""
+ fullpath = os.path.normpath(os.path.join(self.directory_path))
+ return fullpath.startswith(self.base_path)
+
+ def is_empty_file(self, file_content):
+ """
+ Check if the file content is empty.
+ """
+ return len(file_content.strip()) == 0
+
+ def filter_loaded_components(self, data: dict, with_errors: bool) -> dict:
+ items = [
+ {
+ "name": menu["name"],
+ "path": menu["path"],
+ "components": [
+ component
+ for component in menu["components"]
+ if (component["error"] if with_errors else not component["error"])
+ ],
+ }
+ for menu in data["menu"]
+ ]
+ filtered = [menu for menu in items if menu["components"]]
+        logger.debug(
+            f'Filtered components {"with" if with_errors else "without"} errors: {filtered}'
+        )
+ return {"menu": filtered}
+
+ def validate_code(self, file_content):
+ """
+ Validate the Python code by trying to parse it with ast.parse.
+ """
+ try:
+ ast.parse(file_content)
+ return True
+ except SyntaxError:
+ return False
+
+ def validate_build(self, file_content):
+ """
+ Check if the file content contains a function named 'build'.
+ """
+ return "def build" in file_content
+
+ def read_file_content(self, file_path):
+ """
+ Read and return the content of a file.
+ """
+ if not os.path.isfile(file_path):
+ return None
+ with open(file_path, "r") as file:
+ return file.read()
+
+ def get_files(self):
+ """
+ Walk through the directory path and return a list of all .py files.
+ """
+ if not (safe_path := self.get_safe_path()):
+ raise CustomComponentPathValueError(
+ f"The path needs to start with '{self.base_path}'."
+ )
+
+ file_list = []
+ for root, _, files in os.walk(safe_path):
+ file_list.extend(
+ os.path.join(root, filename)
+ for filename in files
+ if filename.endswith(".py") and not filename.startswith("__")
+ )
+ return file_list
+
+ def find_menu(self, response, menu_name):
+ """
+ Find and return a menu by its name in the response.
+ """
+ return next(
+ (menu for menu in response["menu"] if menu["name"] == menu_name),
+ None,
+ )
+
+ def _is_type_hint_imported(self, type_hint_name: str, code: str) -> bool:
+ """
+ Check if a specific type hint is imported
+ from the typing module in the given code.
+ """
+ module = ast.parse(code)
+
+ return any(
+ isinstance(node, ast.ImportFrom)
+ and node.module == "typing"
+ and any(alias.name == type_hint_name for alias in node.names)
+ for node in ast.walk(module)
+ )
+
+ def _is_type_hint_used_in_args(self, type_hint_name: str, code: str) -> bool:
+ """
+ Check if a specific type hint is used in the
+ function definitions within the given code.
+ """
+ try:
+ module = ast.parse(code)
+
+ for node in ast.walk(module):
+ if isinstance(node, ast.FunctionDef):
+ for arg in node.args.args:
+ if self._is_type_hint_in_arg_annotation(
+ arg.annotation, type_hint_name
+ ):
+ return True
+ except SyntaxError:
+ # Returns False if the code is not valid Python
+ return False
+ return False
+
+ def _is_type_hint_in_arg_annotation(self, annotation, type_hint_name: str) -> bool:
+ """
+ Helper function to check if a type hint exists in an annotation.
+ """
+ return (
+ annotation is not None
+ and isinstance(annotation, ast.Subscript)
+ and isinstance(annotation.value, ast.Name)
+ and annotation.value.id == type_hint_name
+ )
+
+ def is_type_hint_used_but_not_imported(
+ self, type_hint_name: str, code: str
+ ) -> bool:
+ """
+ Check if a type hint is used but not imported in the given code.
+ """
+ try:
+ return self._is_type_hint_used_in_args(
+ type_hint_name, code
+ ) and not self._is_type_hint_imported(type_hint_name, code)
+ except SyntaxError:
+ # Returns True if there's something wrong with the code
+ # TODO : Find a better way to handle this
+ return True
+
+ def process_file(self, file_path):
+ """
+ Process a file by validating its content and
+ returning the result and content/error message.
+ """
+ file_content = self.read_file_content(file_path)
+
+ if file_content is None:
+ return False, f"Could not read {file_path}"
+ elif self.is_empty_file(file_content):
+ return False, "Empty file"
+ elif not self.validate_code(file_content):
+ return False, "Syntax error"
+ elif not self.validate_build(file_content):
+ return False, "Missing build function"
+ elif self._is_type_hint_used_in_args(
+ "Optional", file_content
+ ) and not self._is_type_hint_imported("Optional", file_content):
+ return (
+ False,
+ "Type hint 'Optional' is used but not imported in the code.",
+ )
+ else:
+ if self.compress_code_field:
+ file_content = str(StringCompressor(file_content).compress_string())
+ return True, file_content
+
+ def build_component_menu_list(self, file_paths):
+ """
+ Build a list of menus with their components
+ from the .py files in the directory.
+ """
+ response = {"menu": []}
+ logger.debug(
+ "-------------------- Building component menu list --------------------"
+ )
+
+ for file_path in file_paths:
+ menu_name = os.path.basename(os.path.dirname(file_path))
+ logger.debug(f"Menu name: {menu_name}")
+ filename = os.path.basename(file_path)
+ validation_result, result_content = self.process_file(file_path)
+ logger.debug(f"Validation result: {validation_result}")
+
+ menu_result = self.find_menu(response, menu_name) or {
+ "name": menu_name,
+ "path": os.path.dirname(file_path),
+ "components": [],
+ }
+ component_name = filename.split(".")[0]
+            # This is the file name that will be displayed in the UI.
+            # Convert it from snake_case to a spaced Title Case name;
+            # names without underscores are kept as-is
+ if "_" in component_name:
+ component_name_camelcase = " ".join(
+ word.title() for word in component_name.split("_")
+ )
+ else:
+ component_name_camelcase = component_name
+
+ component_info = {
+ "name": "CustomComponent",
+ "output_types": [component_name_camelcase],
+ "file": filename,
+ "code": result_content if validation_result else "",
+ "error": "" if validation_result else result_content,
+ }
+ menu_result["components"].append(component_info)
+
+ logger.debug(f"Component info: {component_info}")
+ if menu_result not in response["menu"]:
+ response["menu"].append(menu_result)
+ logger.debug(
+ "-------------------- Component menu list built --------------------"
+ )
+ return response
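When `compress_code_field` is enabled, `process_file` above stores the source through `StringCompressor`, which is a thin zlib round trip. A minimal standalone sketch of that behavior:

```python
import zlib

# Sample component source; any UTF-8 text works the same way
source = "def build(self) -> str:\n    return 'ok'\n"

# Compress as StringCompressor.compress_string does: encode, then zlib.compress
compressed = zlib.compress(source.encode("utf-8"))

# Decompress as decompress_string does: zlib.decompress, then decode
restored = zlib.decompress(compressed).decode("utf-8")

assert restored == source  # the round trip is lossless
```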
diff --git a/src/backend/langflow/interface/custom/schema.py b/src/backend/langflow/interface/custom/schema.py
new file mode 100644
index 000000000..80d65405f
--- /dev/null
+++ b/src/backend/langflow/interface/custom/schema.py
@@ -0,0 +1,29 @@
+from pydantic import BaseModel, Field
+
+
+from typing import Optional
+
+
+class ClassCodeDetails(BaseModel):
+ """
+ A dataclass for storing details about a class.
+ """
+
+ name: str
+ doc: Optional[str]
+ bases: list
+ attributes: list
+ methods: list
+ init: Optional[dict] = Field(default_factory=dict)
+
+
+class CallableCodeDetails(BaseModel):
+ """
+ A dataclass for storing details about a callable.
+ """
+
+ name: str
+ doc: Optional[str]
+ args: list
+ body: list
+ return_type: Optional[str]
diff --git a/src/backend/langflow/interface/document_loaders/base.py b/src/backend/langflow/interface/document_loaders/base.py
index 5219fbd13..db0832ff3 100644
--- a/src/backend/langflow/interface/document_loaders/base.py
+++ b/src/backend/langflow/interface/document_loaders/base.py
@@ -1,9 +1,10 @@
from typing import Dict, List, Optional, Type
from langflow.interface.base import LangChainTypeCreator
+from langflow.services.utils import get_settings_manager
from langflow.template.frontend_node.documentloaders import DocumentLoaderFrontNode
from langflow.interface.custom_lists import documentloaders_type_to_cls_dict
-from langflow.settings import settings
+
from langflow.utils.logger import logger
from langflow.utils.util import build_template_from_class
@@ -30,10 +31,12 @@ class DocumentLoaderCreator(LangChainTypeCreator):
return None
def to_list(self) -> List[str]:
+ settings_manager = get_settings_manager()
return [
documentloader.__name__
for documentloader in self.type_to_loader_dict.values()
- if documentloader.__name__ in settings.documentloaders or settings.dev
+ if documentloader.__name__ in settings_manager.settings.DOCUMENTLOADERS
+ or settings_manager.settings.DEV
]
diff --git a/src/backend/langflow/interface/embeddings/base.py b/src/backend/langflow/interface/embeddings/base.py
index 1dfa05a99..169985d37 100644
--- a/src/backend/langflow/interface/embeddings/base.py
+++ b/src/backend/langflow/interface/embeddings/base.py
@@ -2,7 +2,8 @@ from typing import Dict, List, Optional, Type
from langflow.interface.base import LangChainTypeCreator
from langflow.interface.custom_lists import embedding_type_to_cls_dict
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.template.frontend_node.base import FrontendNode
from langflow.template.frontend_node.embeddings import EmbeddingFrontendNode
from langflow.utils.logger import logger
@@ -32,10 +33,12 @@ class EmbeddingCreator(LangChainTypeCreator):
return None
def to_list(self) -> List[str]:
+ settings_manager = get_settings_manager()
return [
embedding.__name__
for embedding in self.type_to_loader_dict.values()
- if embedding.__name__ in settings.embeddings or settings.dev
+ if embedding.__name__ in settings_manager.settings.EMBEDDINGS
+ or settings_manager.settings.DEV
]
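This `to_list()` change (repeated across several creators below) swaps the module-level `settings` for a settings manager, but the filtering logic is unchanged: an allow-list with a dev-mode override. A sketch of that pattern, with illustrative names rather than the actual settings fields:

```python
def filter_by_allowlist(class_names, allowed, dev_mode=False):
    """Keep only names on the allow-list, unless dev mode exposes everything.

    Mirrors the to_list() pattern used by the creator classes.
    """
    return [name for name in class_names if name in allowed or dev_mode]


names = ["OpenAIEmbeddings", "FakeEmbeddings"]
print(filter_by_allowlist(names, {"OpenAIEmbeddings"}))  # -> ['OpenAIEmbeddings']
print(filter_by_allowlist(names, set(), dev_mode=True))  # -> both names
```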
diff --git a/src/backend/langflow/interface/importing/utils.py b/src/backend/langflow/interface/importing/utils.py
index ccfd8d5dd..d07222dd1 100644
--- a/src/backend/langflow/interface/importing/utils.py
+++ b/src/backend/langflow/interface/importing/utils.py
@@ -9,6 +9,7 @@ from langchain.base_language import BaseLanguageModel
from langchain.chains.base import Chain
from langchain.chat_models.base import BaseChatModel
from langchain.tools import BaseTool
+from langflow.interface.custom.custom_component import CustomComponent
from langflow.utils import validate
from langflow.interface.wrappers.base import wrapper_creator
@@ -47,6 +48,7 @@ def import_by_type(_type: str, name: str) -> Any:
"utilities": import_utility,
"output_parsers": import_output_parser,
"retrievers": import_retriever,
+ "custom_components": import_custom_component,
}
if _type == "llms":
key = "chat" if "chat" in name.lower() else "llm"
@@ -57,6 +59,11 @@ def import_by_type(_type: str, name: str) -> Any:
return loaded_func(name)
+def import_custom_component(custom_component: str) -> CustomComponent:
+ """Import custom component from custom component name"""
+ return import_class("langflow.interface.custom.custom_component.CustomComponent")
+
+
def import_output_parser(output_parser: str) -> Any:
"""Import output parser from output parser name"""
return import_module(f"from langchain.output_parsers import {output_parser}")
@@ -172,3 +179,8 @@ def get_function(code):
function_name = validate.extract_function_name(code)
return validate.create_function(code, function_name)
+
+
+def get_function_custom(code):
+ class_name = validate.extract_class_name(code)
+ return validate.create_class(code, class_name)
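`get_function_custom` leans on `validate.extract_class_name` and `validate.create_class`, which are not shown in this diff. A simplified sketch of what those helpers plausibly do (the real versions validate the code first; these bodies are assumptions):

```python
import ast


def extract_class_name(code: str) -> str:
    """Return the name of the first class defined in a code string."""
    tree = ast.parse(code)
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            return node.name
    raise ValueError("No class definition found")


def create_class(code: str, class_name: str):
    """Execute the code in an isolated namespace and return the class object."""
    namespace: dict = {}
    exec(code, namespace)
    return namespace[class_name]


source = "class MyComponent:\n    def build(self):\n        return 'ok'\n"
cls = create_class(source, extract_class_name(source))
print(cls().build())  # -> ok
```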
diff --git a/src/backend/langflow/interface/initialize/loading.py b/src/backend/langflow/interface/initialize/loading.py
index b232d089c..c315a9577 100644
--- a/src/backend/langflow/interface/initialize/loading.py
+++ b/src/backend/langflow/interface/initialize/loading.py
@@ -1,21 +1,27 @@
-import contextlib
import json
-from typing import Any, Callable, Dict, List, Sequence, Type
+from typing import Any, Callable, Dict, Sequence, Type
-from langchain.agents import ZeroShotAgent
from langchain.agents import agent as agent_module
from langchain.agents.agent import AgentExecutor
from langchain.agents.agent_toolkits.base import BaseToolkit
from langchain.agents.tools import BaseTool
from langflow.interface.initialize.llm import initialize_vertexai
+from langflow.interface.initialize.utils import (
+ handle_format_kwargs,
+ handle_node_type,
+ handle_partial_variables,
+)
from langflow.interface.initialize.vector_store import vecstore_initializer
-from langchain.schema import Document, BaseOutputParser
from pydantic import ValidationError
+from langflow.interface.importing.utils import (
+ get_function,
+ get_function_custom,
+ import_by_type,
+)
from langflow.interface.custom_lists import CUSTOM_NODES
-from langflow.interface.importing.utils import get_function, import_by_type
from langflow.interface.agents.base import agent_creator
from langflow.interface.toolkits.base import toolkits_creator
from langflow.interface.chains.base import chain_creator
@@ -27,6 +33,7 @@ from langflow.utils import validate
from langchain.chains.base import Chain
from langchain.vectorstores.base import VectorStore
from langchain.document_loaders.base import BaseLoader
+from langflow.utils.logger import logger
def instantiate_class(node_type: str, base_type: str, params: Dict) -> Any:
@@ -38,7 +45,7 @@ def instantiate_class(node_type: str, base_type: str, params: Dict) -> Any:
if hasattr(custom_node, "initialize"):
return custom_node.initialize(**params)
return custom_node(**params)
-
+ logger.debug(f"Instantiating {node_type} of type {base_type}")
class_object = import_by_type(_type=base_type, name=node_type)
return instantiate_based_on_type(class_object, base_type, node_type, params)
@@ -58,7 +65,12 @@ def convert_kwargs(params):
kwargs_keys = [key for key in params.keys() if "kwargs" in key or "config" in key]
for key in kwargs_keys:
if isinstance(params[key], str):
- params[key] = json.loads(params[key])
+ try:
+ params[key] = json.loads(params[key])
+ except json.JSONDecodeError:
+ # if the string is not a valid json string, we will
+ # remove the key from the params
+ params.pop(key, None)
return params
@@ -76,7 +88,7 @@ def instantiate_based_on_type(class_object, base_type, node_type, params):
elif base_type == "toolkits":
return instantiate_toolkit(node_type, class_object, params)
elif base_type == "embeddings":
- return instantiate_embedding(class_object, params)
+ return instantiate_embedding(node_type, class_object, params)
elif base_type == "vectorstores":
return instantiate_vectorstore(class_object, params)
elif base_type == "documentloaders":
@@ -95,12 +107,24 @@ def instantiate_based_on_type(class_object, base_type, node_type, params):
return instantiate_retriever(node_type, class_object, params)
elif base_type == "memory":
return instantiate_memory(node_type, class_object, params)
+ elif base_type == "custom_components":
+ return instantiate_custom_component(node_type, class_object, params)
elif base_type == "wrappers":
return instantiate_wrapper(node_type, class_object, params)
else:
return class_object(**params)
+def instantiate_custom_component(node_type, class_object, params):
+ # we need to make a copy of the params because we will be
+ # modifying it
+ params_copy = params.copy()
+ class_object = get_function_custom(params_copy.pop("code"))
+ custom_component = class_object()
+ built_object = custom_component.build(**params_copy)
+ return built_object, {"repr": custom_component.custom_repr()}
+
+
def instantiate_wrapper(node_type, class_object, params):
if node_type in wrapper_creator.from_method_nodes:
method = wrapper_creator.from_method_nodes[node_type]
@@ -123,7 +147,7 @@ def instantiate_llm(node_type, class_object, params: Dict):
# This is a workaround so JinaChat works until streaming is implemented
# if "openai_api_base" in params and "jina" in params["openai_api_base"]:
# False if condition is True
- if node_type == "VertexAI":
+ if "VertexAI" in node_type:
return initialize_vertexai(class_object=class_object, params=params)
# max_tokens sometimes is a string and should be an int
if "max_tokens" in params:
@@ -199,68 +223,11 @@ def instantiate_agent(node_type, class_object: Type[agent_module.Agent], params:
def instantiate_prompt(node_type, class_object, params: Dict):
- if node_type == "ZeroShotPrompt":
- if "tools" not in params:
- params["tools"] = []
- return ZeroShotAgent.create_prompt(**params)
- elif "MessagePromptTemplate" in node_type:
- # Then we only need the template
- from_template_params = {
- "template": params.pop("prompt", params.pop("template", ""))
- }
-
- if not from_template_params.get("template"):
- raise ValueError("Prompt template is required")
- prompt = class_object.from_template(**from_template_params)
-
- elif node_type == "ChatPromptTemplate":
- prompt = class_object.from_messages(**params)
- else:
- prompt = class_object(**params)
-
- format_kwargs: Dict[str, Any] = {}
- for input_variable in prompt.input_variables:
- if input_variable in params:
- variable = params[input_variable]
- if isinstance(variable, str):
- format_kwargs[input_variable] = variable
- elif isinstance(variable, BaseOutputParser) and hasattr(
- variable, "get_format_instructions"
- ):
- format_kwargs[input_variable] = variable.get_format_instructions()
- elif isinstance(variable, List) and all(
- isinstance(item, Document) for item in variable
- ):
- # Format document to contain page_content and metadata
- # as one string separated by a newline
- if len(variable) > 1:
- content = "\n".join(
- [item.page_content for item in variable if item.page_content]
- )
- else:
- content = variable[0].page_content
- # content could be a json list of strings
- with contextlib.suppress(json.JSONDecodeError):
- content = json.loads(content)
- if isinstance(content, list):
- content = ",".join([str(item) for item in content])
- format_kwargs[input_variable] = content
- # handle_keys will be a list but it does not exist yet
- # so we need to create it
-
- if (
- isinstance(variable, List)
- and all(isinstance(item, Document) for item in variable)
- ) or (
- isinstance(variable, BaseOutputParser)
- and hasattr(variable, "get_format_instructions")
- ):
- if "handle_keys" not in format_kwargs:
- format_kwargs["handle_keys"] = []
-
- # Add the handle_keys to the list
- format_kwargs["handle_keys"].append(input_variable)
-
+ params, prompt = handle_node_type(node_type, class_object, params)
+ format_kwargs = handle_format_kwargs(prompt, params)
+ # Now we'll use partial_format to format the prompt
+ if format_kwargs:
+ prompt = handle_partial_variables(prompt, format_kwargs)
return prompt, format_kwargs
@@ -294,9 +261,13 @@ def instantiate_toolkit(node_type, class_object: Type[BaseToolkit], params: Dict
return loaded_toolkit
-def instantiate_embedding(class_object, params: Dict):
+def instantiate_embedding(node_type, class_object, params: Dict):
params.pop("model", None)
params.pop("headers", None)
+
+ if "VertexAI" in node_type:
+ return initialize_vertexai(class_object=class_object, params=params)
+
try:
return class_object(**params)
except ValidationError:
@@ -363,6 +334,8 @@ def instantiate_textsplitter(
):
try:
documents = params.pop("documents")
+ if not isinstance(documents, list):
+ documents = [documents]
except KeyError as exc:
raise ValueError(
"The source you provided did not load correctly or was empty."
diff --git a/src/backend/langflow/interface/initialize/utils.py b/src/backend/langflow/interface/initialize/utils.py
new file mode 100644
index 000000000..976d8906c
--- /dev/null
+++ b/src/backend/langflow/interface/initialize/utils.py
@@ -0,0 +1,113 @@
+import contextlib
+import json
+from typing import Any, Dict, List
+
+from langchain.agents import ZeroShotAgent
+
+
+from langchain.schema import Document, BaseOutputParser
+
+
+def handle_node_type(node_type, class_object, params: Dict):
+ if node_type == "ZeroShotPrompt":
+ params = check_tools_in_params(params)
+ prompt = ZeroShotAgent.create_prompt(**params)
+ elif "MessagePromptTemplate" in node_type:
+ prompt = instantiate_from_template(class_object, params)
+ elif node_type == "ChatPromptTemplate":
+ prompt = class_object.from_messages(**params)
+ else:
+ prompt = class_object(**params)
+ return params, prompt
+
+
+def check_tools_in_params(params: Dict):
+ if "tools" not in params:
+ params["tools"] = []
+ return params
+
+
+def instantiate_from_template(class_object, params: Dict):
+ from_template_params = {
+ "template": params.pop("prompt", params.pop("template", ""))
+ }
+ if not from_template_params.get("template"):
+ raise ValueError("Prompt template is required")
+ return class_object.from_template(**from_template_params)
+
+
+def handle_format_kwargs(prompt, params: Dict):
+ format_kwargs: Dict[str, Any] = {}
+ for input_variable in prompt.input_variables:
+ if input_variable in params:
+ format_kwargs = handle_variable(params, input_variable, format_kwargs)
+ return format_kwargs
+
+
+def handle_partial_variables(prompt, format_kwargs: Dict):
+ partial_variables = format_kwargs.copy()
+ partial_variables = {
+ key: value for key, value in partial_variables.items() if value
+ }
+ # Remove handle_keys otherwise LangChain raises an error
+ partial_variables.pop("handle_keys", None)
+ return prompt.partial(**partial_variables)
+
+
+def handle_variable(params: Dict, input_variable: str, format_kwargs: Dict):
+ variable = params[input_variable]
+ if isinstance(variable, str):
+ format_kwargs[input_variable] = variable
+ elif isinstance(variable, BaseOutputParser) and hasattr(
+ variable, "get_format_instructions"
+ ):
+ format_kwargs[input_variable] = variable.get_format_instructions()
+ elif is_instance_of_list_or_document(variable):
+ format_kwargs = format_document(variable, input_variable, format_kwargs)
+ if needs_handle_keys(variable):
+ format_kwargs = add_handle_keys(input_variable, format_kwargs)
+ return format_kwargs
+
+
+def is_instance_of_list_or_document(variable):
+ return (
+ isinstance(variable, List)
+ and all(isinstance(item, Document) for item in variable)
+ or isinstance(variable, Document)
+ )
+
+
+def format_document(variable, input_variable: str, format_kwargs: Dict):
+ variable = variable if isinstance(variable, List) else [variable]
+ content = format_content(variable)
+ format_kwargs[input_variable] = content
+ return format_kwargs
+
+
+def format_content(variable):
+ if len(variable) > 1:
+ return "\n".join([item.page_content for item in variable if item.page_content])
+ content = variable[0].page_content
+ return try_to_load_json(content)
+
+
+def try_to_load_json(content):
+ with contextlib.suppress(json.JSONDecodeError):
+ content = json.loads(content)
+ if isinstance(content, list):
+ content = ",".join([str(item) for item in content])
+ return content
+
+
+def needs_handle_keys(variable):
+ return is_instance_of_list_or_document(variable) or (
+ isinstance(variable, BaseOutputParser)
+ and hasattr(variable, "get_format_instructions")
+ )
+
+
+def add_handle_keys(input_variable: str, format_kwargs: Dict):
+ if "handle_keys" not in format_kwargs:
+ format_kwargs["handle_keys"] = []
+ format_kwargs["handle_keys"].append(input_variable)
+ return format_kwargs
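The document-formatting helpers extracted into this new module are easiest to see in isolation. A sketch of `format_content` plus `try_to_load_json`, using a minimal stand-in for `langchain.schema.Document`:

```python
import contextlib
import json
from dataclasses import dataclass


@dataclass
class Document:
    """Minimal stand-in for langchain.schema.Document."""

    page_content: str


def format_content(docs):
    """Join multiple documents with newlines; for a single document,
    also unwrap a JSON list into a comma-separated string."""
    if len(docs) > 1:
        return "\n".join(d.page_content for d in docs if d.page_content)
    content = docs[0].page_content
    with contextlib.suppress(json.JSONDecodeError):
        content = json.loads(content)
    if isinstance(content, list):
        content = ",".join(str(item) for item in content)
    return content


print(format_content([Document('["a", "b"]')]))       # -> a,b
print(format_content([Document("x"), Document("y")]))  # -> x and y on two lines
```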
diff --git a/src/backend/langflow/interface/initialize/vector_store.py b/src/backend/langflow/interface/initialize/vector_store.py
index d4bdb0155..1bc2d73e0 100644
--- a/src/backend/langflow/interface/initialize/vector_store.py
+++ b/src/backend/langflow/interface/initialize/vector_store.py
@@ -130,8 +130,8 @@ def initialize_pinecone(class_object: Type[Pinecone], params: dict):
import pinecone # type: ignore
- pinecone_api_key = params.get("pinecone_api_key")
- pinecone_env = params.get("pinecone_env")
+ pinecone_api_key = params.pop("pinecone_api_key")
+ pinecone_env = params.pop("pinecone_env")
if pinecone_api_key is None or pinecone_env is None:
if os.getenv("PINECONE_API_KEY") is not None:
@@ -170,6 +170,26 @@ def initialize_pinecone(class_object: Type[Pinecone], params: dict):
def initialize_chroma(class_object: Type[Chroma], params: dict):
"""Initialize a ChromaDB object from the params"""
+ if ( # type: ignore
+ "chroma_server_host" in params or "chroma_server_http_port" in params
+ ):
+ import chromadb # type: ignore
+
+ settings_params = {
+            key: value
+            for key, value in params.items()
+            if key.startswith("chroma_server_") and value
+ }
+ chroma_settings = chromadb.config.Settings(**settings_params)
+ params["client_settings"] = chroma_settings
+ else:
+ # remove all chroma_server_ keys from params
+ params = {
+ key: value
+ for key, value in params.items()
+ if not key.startswith("chroma_server_")
+ }
+
persist = params.pop("persist", False)
if not docs_in_params(params):
params.pop("documents", None)
diff --git a/src/backend/langflow/interface/listing.py b/src/backend/langflow/interface/listing.py
index 0893f855a..1cab1efbc 100644
--- a/src/backend/langflow/interface/listing.py
+++ b/src/backend/langflow/interface/listing.py
@@ -13,33 +13,44 @@ from langflow.interface.vector_store.base import vectorstore_creator
from langflow.interface.wrappers.base import wrapper_creator
from langflow.interface.output_parsers.base import output_parser_creator
from langflow.interface.retrievers.base import retriever_creator
+from langflow.interface.custom.base import custom_component_creator
+from langflow.utils.lazy_load import LazyLoadDictBase
-def get_type_dict():
- return {
- "agents": agent_creator.to_list(),
- "prompts": prompt_creator.to_list(),
- "llms": llm_creator.to_list(),
- "tools": tool_creator.to_list(),
- "chains": chain_creator.to_list(),
- "memory": memory_creator.to_list(),
- "toolkits": toolkits_creator.to_list(),
- "wrappers": wrapper_creator.to_list(),
- "documentLoaders": documentloader_creator.to_list(),
- "vectorStore": vectorstore_creator.to_list(),
- "embeddings": embedding_creator.to_list(),
- "textSplitters": textsplitter_creator.to_list(),
- "utilities": utility_creator.to_list(),
- "outputParsers": output_parser_creator.to_list(),
- "retrievers": retriever_creator.to_list(),
- }
+class AllTypesDict(LazyLoadDictBase):
+ def __init__(self):
+ self._all_types_dict = None
+
+ @property
+ def ALL_TYPES_DICT(self):
+ return self.all_types_dict
+
+ def _build_dict(self):
+ langchain_types_dict = self.get_type_dict()
+ return {
+ **langchain_types_dict,
+ "Custom": ["Custom Tool", "Python Function"],
+ }
+
+ def get_type_dict(self):
+ return {
+ "agents": agent_creator.to_list(),
+ "prompts": prompt_creator.to_list(),
+ "llms": llm_creator.to_list(),
+ "tools": tool_creator.to_list(),
+ "chains": chain_creator.to_list(),
+ "memory": memory_creator.to_list(),
+ "toolkits": toolkits_creator.to_list(),
+ "wrappers": wrapper_creator.to_list(),
+ "documentLoaders": documentloader_creator.to_list(),
+ "vectorStore": vectorstore_creator.to_list(),
+ "embeddings": embedding_creator.to_list(),
+ "textSplitters": textsplitter_creator.to_list(),
+ "utilities": utility_creator.to_list(),
+ "outputParsers": output_parser_creator.to_list(),
+ "retrievers": retriever_creator.to_list(),
+ "custom_components": custom_component_creator.to_list(),
+ }
-LANGCHAIN_TYPES_DICT = get_type_dict()
-
-# Now we'll build a dict with Langchain types and ours
-
-ALL_TYPES_DICT = {
- **LANGCHAIN_TYPES_DICT,
- "Custom": ["Custom Tool", "Python Function"],
-}
+lazy_load_dict = AllTypesDict()
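The refactor above replaces a dict built at import time with a lazily built one. `LazyLoadDictBase` is not shown in this diff; a plausible sketch of the pattern (build once on first access, cache thereafter):

```python
class LazyLoadDictBase:
    """Build an expensive dict only on first property access.

    A sketch of the pattern behind AllTypesDict; the real base class
    lives in langflow.utils.lazy_load and may differ in detail.
    """

    def __init__(self):
        self._all_types_dict = None

    @property
    def all_types_dict(self):
        if self._all_types_dict is None:
            self._all_types_dict = self._build_dict()
        return self._all_types_dict

    def _build_dict(self):
        raise NotImplementedError


class DemoTypesDict(LazyLoadDictBase):
    calls = 0  # counts how many times the dict is actually built

    def _build_dict(self):
        type(self).calls += 1
        return {"Custom": ["Custom Tool", "Python Function"]}


d = DemoTypesDict()
first = d.all_types_dict
second = d.all_types_dict
print(DemoTypesDict.calls)  # -> 1 (built once, cached thereafter)
```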
diff --git a/src/backend/langflow/interface/llms/base.py b/src/backend/langflow/interface/llms/base.py
index 66e153880..f562b99ed 100644
--- a/src/backend/langflow/interface/llms/base.py
+++ b/src/backend/langflow/interface/llms/base.py
@@ -2,7 +2,8 @@ from typing import Dict, List, Optional, Type
from langflow.interface.base import LangChainTypeCreator
from langflow.interface.custom_lists import llm_type_to_cls_dict
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.template.frontend_node.llms import LLMFrontendNode
from langflow.utils.logger import logger
from langflow.utils.util import build_template_from_class
@@ -33,10 +34,12 @@ class LLMCreator(LangChainTypeCreator):
return None
def to_list(self) -> List[str]:
+ settings_manager = get_settings_manager()
return [
llm.__name__
for llm in self.type_to_loader_dict.values()
- if llm.__name__ in settings.llms or settings.dev
+ if llm.__name__ in settings_manager.settings.LLMS
+ or settings_manager.settings.DEV
]
diff --git a/src/backend/langflow/interface/memories/base.py b/src/backend/langflow/interface/memories/base.py
index 0f97a02fe..70665602c 100644
--- a/src/backend/langflow/interface/memories/base.py
+++ b/src/backend/langflow/interface/memories/base.py
@@ -2,7 +2,8 @@ from typing import Dict, List, Optional, Type
from langflow.interface.base import LangChainTypeCreator
from langflow.interface.custom_lists import memory_type_to_cls_dict
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.template.frontend_node.base import FrontendNode
from langflow.template.frontend_node.memories import MemoryFrontendNode
from langflow.utils.logger import logger
@@ -48,10 +49,12 @@ class MemoryCreator(LangChainTypeCreator):
return None
def to_list(self) -> List[str]:
+ settings_manager = get_settings_manager()
return [
memory.__name__
for memory in self.type_to_loader_dict.values()
- if memory.__name__ in settings.memories or settings.dev
+ if memory.__name__ in settings_manager.settings.MEMORIES
+ or settings_manager.settings.DEV
]
diff --git a/src/backend/langflow/interface/output_parsers/base.py b/src/backend/langflow/interface/output_parsers/base.py
index 79cbdd98c..256b521e1 100644
--- a/src/backend/langflow/interface/output_parsers/base.py
+++ b/src/backend/langflow/interface/output_parsers/base.py
@@ -4,7 +4,8 @@ from langchain import output_parsers
from langflow.interface.base import LangChainTypeCreator
from langflow.interface.importing.utils import import_class
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.template.frontend_node.output_parsers import OutputParserFrontendNode
from langflow.utils.logger import logger
from langflow.utils.util import build_template_from_class, build_template_from_method
@@ -23,6 +24,7 @@ class OutputParserCreator(LangChainTypeCreator):
@property
def type_to_loader_dict(self) -> Dict:
if self.type_dict is None:
+ settings_manager = get_settings_manager()
self.type_dict = {
output_parser_name: import_class(
f"langchain.output_parsers.{output_parser_name}"
@@ -33,7 +35,8 @@ class OutputParserCreator(LangChainTypeCreator):
self.type_dict = {
name: output_parser
for name, output_parser in self.type_dict.items()
- if name in settings.output_parsers or settings.dev
+ if name in settings_manager.settings.OUTPUT_PARSERS
+ or settings_manager.settings.DEV
}
return self.type_dict
diff --git a/src/backend/langflow/interface/prompts/base.py b/src/backend/langflow/interface/prompts/base.py
index 39bd94c5b..5aa41dfb2 100644
--- a/src/backend/langflow/interface/prompts/base.py
+++ b/src/backend/langflow/interface/prompts/base.py
@@ -5,7 +5,8 @@ from langchain import prompts
from langflow.custom.customs import get_custom_nodes
from langflow.interface.base import LangChainTypeCreator
from langflow.interface.importing.utils import import_class
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.template.frontend_node.prompts import PromptFrontendNode
from langflow.utils.logger import logger
from langflow.utils.util import build_template_from_class
@@ -20,6 +21,7 @@ class PromptCreator(LangChainTypeCreator):
@property
def type_to_loader_dict(self) -> Dict:
+ settings_manager = get_settings_manager()
if self.type_dict is None:
self.type_dict = {
prompt_name: import_class(f"langchain.prompts.{prompt_name}")
@@ -34,7 +36,8 @@ class PromptCreator(LangChainTypeCreator):
self.type_dict = {
name: prompt
for name, prompt in self.type_dict.items()
- if name in settings.prompts or settings.dev
+ if name in settings_manager.settings.PROMPTS
+ or settings_manager.settings.DEV
}
return self.type_dict
diff --git a/src/backend/langflow/interface/retrievers/base.py b/src/backend/langflow/interface/retrievers/base.py
index dc6056656..db1cfd165 100644
--- a/src/backend/langflow/interface/retrievers/base.py
+++ b/src/backend/langflow/interface/retrievers/base.py
@@ -4,7 +4,8 @@ from langchain import retrievers
from langflow.interface.base import LangChainTypeCreator
from langflow.interface.importing.utils import import_class
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.template.frontend_node.retrievers import RetrieverFrontendNode
from langflow.utils.logger import logger
from langflow.utils.util import build_template_from_method, build_template_from_class
@@ -48,10 +49,12 @@ class RetrieverCreator(LangChainTypeCreator):
return None
def to_list(self) -> List[str]:
+ settings_manager = get_settings_manager()
return [
retriever
for retriever in self.type_to_loader_dict.keys()
- if retriever in settings.retrievers or settings.dev
+ if retriever in settings_manager.settings.RETRIEVERS
+ or settings_manager.settings.DEV
]
diff --git a/src/backend/langflow/interface/run.py b/src/backend/langflow/interface/run.py
index 97f47334e..cb0573bf7 100644
--- a/src/backend/langflow/interface/run.py
+++ b/src/backend/langflow/interface/run.py
@@ -1,4 +1,4 @@
-from langflow.cache.utils import memoize_dict
+from langflow.services.cache.utils import memoize_dict
from langflow.graph import Graph
from langflow.utils.logger import logger
diff --git a/src/backend/langflow/interface/text_splitters/base.py b/src/backend/langflow/interface/text_splitters/base.py
index 203f30086..87b778c4c 100644
--- a/src/backend/langflow/interface/text_splitters/base.py
+++ b/src/backend/langflow/interface/text_splitters/base.py
@@ -1,9 +1,10 @@
from typing import Dict, List, Optional, Type
from langflow.interface.base import LangChainTypeCreator
+from langflow.services.utils import get_settings_manager
from langflow.template.frontend_node.textsplitters import TextSplittersFrontendNode
from langflow.interface.custom_lists import textsplitter_type_to_cls_dict
-from langflow.settings import settings
+
from langflow.utils.logger import logger
from langflow.utils.util import build_template_from_class
@@ -30,10 +31,12 @@ class TextSplitterCreator(LangChainTypeCreator):
return None
def to_list(self) -> List[str]:
+ settings_manager = get_settings_manager()
return [
textsplitter.__name__
for textsplitter in self.type_to_loader_dict.values()
- if textsplitter.__name__ in settings.textsplitters or settings.dev
+ if textsplitter.__name__ in settings_manager.settings.TEXTSPLITTERS
+ or settings_manager.settings.DEV
]
diff --git a/src/backend/langflow/interface/toolkits/base.py b/src/backend/langflow/interface/toolkits/base.py
index be2345c02..c13ffdbd9 100644
--- a/src/backend/langflow/interface/toolkits/base.py
+++ b/src/backend/langflow/interface/toolkits/base.py
@@ -4,7 +4,8 @@ from langchain.agents import agent_toolkits
from langflow.interface.base import LangChainTypeCreator
from langflow.interface.importing.utils import import_class, import_module
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.utils.logger import logger
from langflow.utils.util import build_template_from_class
@@ -29,13 +30,15 @@ class ToolkitCreator(LangChainTypeCreator):
@property
def type_to_loader_dict(self) -> Dict:
if self.type_dict is None:
+ settings_manager = get_settings_manager()
self.type_dict = {
toolkit_name: import_class(
f"langchain.agents.agent_toolkits.{toolkit_name}"
)
# if toolkit_name is not lower case it is a class
for toolkit_name in agent_toolkits.__all__
- if not toolkit_name.islower() and toolkit_name in settings.toolkits
+ if not toolkit_name.islower()
+ and toolkit_name in settings_manager.settings.TOOLKITS
}
return self.type_dict
diff --git a/src/backend/langflow/interface/tools/base.py b/src/backend/langflow/interface/tools/base.py
index 027224a3a..1dbc9a6ed 100644
--- a/src/backend/langflow/interface/tools/base.py
+++ b/src/backend/langflow/interface/tools/base.py
@@ -15,7 +15,8 @@ from langflow.interface.tools.constants import (
OTHER_TOOLS,
)
from langflow.interface.tools.util import get_tool_params
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.template.field.base import TemplateField
from langflow.template.template.base import Template
from langflow.utils import util
@@ -55,7 +56,7 @@ TOOL_INPUTS = {
show=True,
value="",
suffixes=[".json", ".yaml", ".yml"],
- fileTypes=["json", "yaml", "yml"],
+ file_types=["json", "yaml", "yml"],
),
}
@@ -66,6 +67,7 @@ class ToolCreator(LangChainTypeCreator):
@property
def type_to_loader_dict(self) -> Dict:
+ settings_manager = get_settings_manager()
if self.tools_dict is None:
all_tools = {}
@@ -74,7 +76,10 @@ class ToolCreator(LangChainTypeCreator):
tool_name = tool_params.get("name") or tool
- if tool_name in settings.tools or settings.dev:
+ if (
+ tool_name in settings_manager.settings.TOOLS
+ or settings_manager.settings.DEV
+ ):
if tool_name == "JsonSpec":
tool_params["path"] = tool_params.pop("dict_") # type: ignore
all_tools[tool_name] = {
diff --git a/src/backend/langflow/interface/tools/constants.py b/src/backend/langflow/interface/tools/constants.py
index fea3c5237..dc1bfe0c1 100644
--- a/src/backend/langflow/interface/tools/constants.py
+++ b/src/backend/langflow/interface/tools/constants.py
@@ -9,7 +9,10 @@ from langchain.agents.load_tools import (
from langchain.tools.json.tool import JsonSpec
from langflow.interface.importing.utils import import_class
-from langflow.interface.tools.custom import PythonFunctionTool, PythonFunction
+from langflow.interface.tools.custom import (
+ PythonFunctionTool,
+ PythonFunction,
+)
FILE_TOOLS = {"JsonSpec": JsonSpec}
CUSTOM_TOOLS = {
diff --git a/src/backend/langflow/interface/tools/custom.py b/src/backend/langflow/interface/tools/custom.py
index 0e2e5ff57..321298e34 100644
--- a/src/backend/langflow/interface/tools/custom.py
+++ b/src/backend/langflow/interface/tools/custom.py
@@ -34,8 +34,6 @@ class Function(BaseModel):
class PythonFunctionTool(Function, Tool):
- """Python function"""
-
name: str = "Custom Tool"
description: str
code: str
@@ -49,6 +47,4 @@ class PythonFunctionTool(Function, Tool):
class PythonFunction(Function):
- """Python function"""
-
code: str
diff --git a/src/backend/langflow/interface/types.py b/src/backend/langflow/interface/types.py
index 72ecb6775..885e33694 100644
--- a/src/backend/langflow/interface/types.py
+++ b/src/backend/langflow/interface/types.py
@@ -1,7 +1,13 @@
+import ast
+import contextlib
+from typing import Any, List
+from langflow.api.utils import merge_nested_dicts_with_renaming
from langflow.interface.agents.base import agent_creator
from langflow.interface.chains.base import chain_creator
+from langflow.interface.custom.constants import CUSTOM_COMPONENT_SUPPORTED_TYPES
from langflow.interface.document_loaders.base import documentloader_creator
from langflow.interface.embeddings.base import embedding_creator
+from langflow.interface.importing.utils import get_function_custom
from langflow.interface.llms.base import llm_creator
from langflow.interface.memories.base import memory_creator
from langflow.interface.prompts.base import prompt_creator
@@ -12,9 +18,27 @@ from langflow.interface.utilities.base import utility_creator
from langflow.interface.vector_store.base import vectorstore_creator
from langflow.interface.wrappers.base import wrapper_creator
from langflow.interface.output_parsers.base import output_parser_creator
+from langflow.interface.custom.base import custom_component_creator
+from langflow.interface.custom.custom_component import CustomComponent
+
+from langflow.template.field.base import TemplateField
+from langflow.template.frontend_node.constants import CLASSES_TO_REMOVE
+from langflow.template.frontend_node.custom_components import (
+ CustomComponentFrontendNode,
+)
from langflow.interface.retrievers.base import retriever_creator
+from langflow.interface.custom.directory_reader import DirectoryReader
+from langflow.utils.logger import logger
+from langflow.utils.util import get_base_classes
+import re
+import warnings
+import traceback
+from fastapi import HTTPException
+
+
+# Used to get the base_classes list
def get_type_list():
"""Get a list of all langchain types"""
all_types = build_langchain_types_dict()
@@ -29,7 +53,6 @@ def get_type_list():
def build_langchain_types_dict(): # sourcery skip: dict-assign-update-to-union
"""Build a dictionary of all langchain types"""
-
all_types = {}
creators = [
@@ -48,6 +71,7 @@ def build_langchain_types_dict(): # sourcery skip: dict-assign-update-to-union
utility_creator,
output_parser_creator,
retriever_creator,
+ custom_component_creator,
]
all_types = {}
@@ -55,7 +79,352 @@ def build_langchain_types_dict(): # sourcery skip: dict-assign-update-to-union
created_types = creator.to_dict()
if created_types[creator.type_name].values():
all_types.update(created_types)
+
return all_types
-langchain_types_dict = build_langchain_types_dict()
+def process_type(field_type: str):
+ return "prompt" if field_type == "Prompt" else field_type
+
+
+# TODO: Move to correct place
+def add_new_custom_field(
+ template,
+ field_name: str,
+ field_type: str,
+ field_value: Any,
+ field_required: bool,
+ field_config: dict,
+):
+ # Check field_config if any of the keys are in it
+ # if it is, update the value
+ display_name = field_config.pop("display_name", field_name)
+ field_type = field_config.pop("field_type", field_type)
+ field_type = process_type(field_type)
+ field_value = field_config.pop("value", field_value)
+ field_advanced = field_config.pop("advanced", False)
+
+ if field_type == "bool" and field_value is None:
+ field_value = False
+
+ # If options is a list, then it's a dropdown
+ # If options is None, then it's a list of strings
+ is_list = isinstance(field_config.get("options"), list)
+ field_config["is_list"] = is_list or field_config.get("is_list", False)
+
+ if "name" in field_config:
+ warnings.warn(
+ "The 'name' key in field_config is used to build the object and can't be changed."
+ )
+ field_config.pop("name", None)
+
+ required = field_config.pop("required", field_required)
+ placeholder = field_config.pop("placeholder", "")
+
+ new_field = TemplateField(
+ name=field_name,
+ field_type=field_type,
+ value=field_value,
+ show=True,
+ required=required,
+ advanced=field_advanced,
+ placeholder=placeholder,
+ display_name=display_name,
+ **field_config,
+ )
+ template.get("template")[field_name] = new_field.to_dict()
+ template.get("custom_fields")[field_name] = None
+
+ return template
+
+
+# TODO: Move to correct place
+def add_code_field(template, raw_code, field_config):
+ # Field with the Python code to allow update
+
+ code_field = {
+ "code": {
+ "dynamic": True,
+ "required": True,
+ "placeholder": "",
+ "show": field_config.pop("show", True),
+ "multiline": True,
+ "value": raw_code,
+ "password": False,
+ "name": "code",
+ "advanced": field_config.pop("advanced", False),
+ "type": "code",
+ "list": False,
+ }
+ }
+ template.get("template")["code"] = code_field.get("code")
+
+ return template
+
+
+def extract_type_from_optional(field_type):
+ """
+    Extract the type from a string formatted as "Optional[<type>]".
+
+    Parameters:
+        field_type (str): The string from which to extract the type.
+
+    Returns:
+        str | None: The extracted type, or None if no type was found.
+ """
+ match = re.search(r"\[(.*?)\]", field_type)
+ return match[1] if match else None
+
+
+def build_frontend_node(custom_component: CustomComponent):
+ """Build a frontend node for a custom component"""
+ try:
+ return (
+ CustomComponentFrontendNode().to_dict().get(type(custom_component).__name__)
+ )
+
+ except Exception as exc:
+ logger.error(f"Error while building base frontend node: {exc}")
+ return None
+
+
+def update_attributes(frontend_node, template_config):
+ """Update the display name and description of a frontend node"""
+ if "display_name" in template_config:
+ frontend_node["display_name"] = template_config["display_name"]
+
+ if "description" in template_config:
+ frontend_node["description"] = template_config["description"]
+
+ if "beta" in template_config:
+ frontend_node["beta"] = template_config["beta"]
+
+ if "documentation" in template_config:
+ frontend_node["documentation"] = template_config["documentation"]
+
+
+def build_field_config(custom_component: CustomComponent):
+ """Build the field configuration for a custom component"""
+
+ try:
+ custom_class = get_function_custom(custom_component.code)
+ except Exception as exc:
+ logger.error(f"Error while getting custom function: {str(exc)}")
+ return {}
+
+ try:
+ return custom_class().build_config()
+ except Exception as exc:
+ logger.error(f"Error while building field config: {str(exc)}")
+ return {}
+
+
+def add_extra_fields(frontend_node, field_config, function_args):
+ """Add extra fields to the frontend node"""
+ if function_args is None or function_args == "":
+ return
+
+ # sort function_args which is a list of dicts
+ function_args.sort(key=lambda x: x["name"])
+
+ for extra_field in function_args:
+ if "name" not in extra_field or extra_field["name"] == "self":
+ continue
+
+ field_name, field_type, field_value, field_required = get_field_properties(
+ extra_field
+ )
+ config = field_config.get(field_name, {})
+ frontend_node = add_new_custom_field(
+ frontend_node,
+ field_name,
+ field_type,
+ field_value,
+ field_required,
+ config,
+ )
+
+
+def get_field_properties(extra_field):
+ """Get the properties of an extra field"""
+ field_name = extra_field["name"]
+ field_type = extra_field.get("type", "str")
+ field_value = extra_field.get("default", "")
+ field_required = "optional" not in field_type.lower()
+
+ if not field_required:
+ field_type = extract_type_from_optional(field_type)
+
+ with contextlib.suppress(Exception):
+ field_value = ast.literal_eval(field_value)
+ return field_name, field_type, field_value, field_required
+
+
+def add_base_classes(frontend_node, return_types: List[str]):
+ """Add base classes to the frontend node"""
+ for return_type in return_types:
+        if return_type is None or return_type not in CUSTOM_COMPONENT_SUPPORTED_TYPES:
+ raise HTTPException(
+ status_code=400,
+ detail={
+ "error": (
+                        "Invalid return type; it should be one of: "
+ f"{list(CUSTOM_COMPONENT_SUPPORTED_TYPES.keys())}"
+ ),
+ "traceback": traceback.format_exc(),
+ },
+ )
+
+ return_type_instance = CUSTOM_COMPONENT_SUPPORTED_TYPES.get(return_type)
+ base_classes = get_base_classes(return_type_instance)
+
+ for base_class in base_classes:
+ if base_class not in CLASSES_TO_REMOVE:
+ frontend_node.get("base_classes").append(base_class)
+
+
+def build_langchain_template_custom_component(custom_component: CustomComponent):
+ """Build a custom component template for the langchain"""
+ logger.debug("Building custom component template")
+ frontend_node = build_frontend_node(custom_component)
+
+ if frontend_node is None:
+ return None
+ logger.debug("Built base frontend node")
+ template_config = custom_component.build_template_config
+
+ update_attributes(frontend_node, template_config)
+ logger.debug("Updated attributes")
+ field_config = build_field_config(custom_component)
+ logger.debug("Built field config")
+ add_extra_fields(
+ frontend_node, field_config, custom_component.get_function_entrypoint_args
+ )
+ logger.debug("Added extra fields")
+ frontend_node = add_code_field(
+ frontend_node, custom_component.code, field_config.get("code", {})
+ )
+ logger.debug("Added code field")
+ add_base_classes(
+ frontend_node, custom_component.get_function_entrypoint_return_type
+ )
+ logger.debug("Added base classes")
+ return frontend_node
+
+
+def load_files_from_path(path: str):
+ """Load all files from a given path"""
+ reader = DirectoryReader(path, False)
+
+ return reader.get_files()
+
+
+def build_and_validate_all_files(reader: DirectoryReader, file_list):
+ """Build and validate all files"""
+ data = reader.build_component_menu_list(file_list)
+
+ valid_components = reader.filter_loaded_components(data=data, with_errors=False)
+ invalid_components = reader.filter_loaded_components(data=data, with_errors=True)
+
+ return valid_components, invalid_components
+
+
+def build_valid_menu(valid_components):
+ """Build the valid menu"""
+ valid_menu = {}
+ logger.debug("------------------- VALID COMPONENTS -------------------")
+ for menu_item in valid_components["menu"]:
+ menu_name = menu_item["name"]
+ valid_menu[menu_name] = {}
+
+ for component in menu_item["components"]:
+ logger.debug(f"Building component: {component}")
+ try:
+ component_name = component["name"]
+ component_code = component["code"]
+ component_output_types = component["output_types"]
+
+ component_extractor = CustomComponent(code=component_code)
+ component_extractor.is_check_valid()
+
+ component_template = build_langchain_template_custom_component(
+ component_extractor
+ )
+ component_template["output_types"] = component_output_types
+ if len(component_output_types) == 1:
+ component_name = component_output_types[0]
+ else:
+ file_name = component.get("file").split(".")[0]
+ if "_" in file_name:
+ # turn .py file into camelcase
+ component_name = "".join(
+ [word.capitalize() for word in file_name.split("_")]
+ )
+ else:
+ component_name = file_name
+
+ valid_menu[menu_name][component_name] = component_template
+                logger.debug(f"Added {component_name} to valid menu {menu_name}")
+
+            except Exception as exc:
+                logger.error(f"Error loading component: {component.get('name')}")
+                logger.exception(
+                    f"Error while building custom component {component.get('name')}: {exc}"
+                )
+
+ return valid_menu
+
+
+def build_invalid_menu(invalid_components):
+ """Build the invalid menu"""
+ if invalid_components.get("menu"):
+ logger.debug("------------------- INVALID COMPONENTS -------------------")
+ invalid_menu = {}
+ for menu_item in invalid_components["menu"]:
+ menu_name = menu_item["name"]
+ invalid_menu[menu_name] = {}
+
+ for component in menu_item["components"]:
+ try:
+ component_name = component["name"]
+ component_code = component["code"]
+
+ component_template = (
+ CustomComponentFrontendNode(
+ description="ERROR - Check your Python Code",
+ display_name=f"ERROR - {component_name}",
+ )
+ .to_dict()
+                    .get(CustomComponent.__name__)
+ )
+
+ component_template["error"] = component.get("error", None)
+ logger.debug(component)
+ logger.debug(f"Component Path: {component.get('path', None)}")
+ logger.debug(f"Component Error: {component.get('error', None)}")
+ component_template.get("template").get("code")["value"] = component_code
+
+ invalid_menu[menu_name][component_name] = component_template
+                    logger.debug(f"Added {component_name} to invalid menu {menu_name}")
+
+            except Exception as exc:
+                logger.exception(
+                    f"Error while creating custom component [{component.get('name')}]: {str(exc)}"
+                )
+
+ return invalid_menu
+
+
+def build_langchain_custom_component_list_from_path(path: str):
+ """Build a list of custom components for the langchain from a given path"""
+ file_list = load_files_from_path(path)
+ reader = DirectoryReader(path, False)
+
+ valid_components, invalid_components = build_and_validate_all_files(
+ reader, file_list
+ )
+
+ valid_menu = build_valid_menu(valid_components)
+ invalid_menu = build_invalid_menu(invalid_components)
+
+ return merge_nested_dicts_with_renaming(valid_menu, invalid_menu)
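The `extract_type_from_optional` and `get_field_properties` helpers above can be exercised in isolation. A minimal standalone sketch (the dict shape for an argument entry is inferred from the code above):

```python
import ast
import re


def extract_type_from_optional(field_type: str):
    # Pull "str" out of "Optional[str]"; returns None when nothing matches.
    match = re.search(r"\[(.*?)\]", field_type)
    return match[1] if match else None


def get_field_properties(extra_field: dict):
    # A field is required unless its annotation is Optional[...];
    # string defaults are literal-eval'ed into Python values when possible.
    field_name = extra_field["name"]
    field_type = extra_field.get("type", "str")
    field_value = extra_field.get("default", "")
    field_required = "optional" not in field_type.lower()

    if not field_required:
        field_type = extract_type_from_optional(field_type)

    try:
        field_value = ast.literal_eval(field_value)
    except (ValueError, SyntaxError):
        pass
    return field_name, field_type, field_value, field_required


print(get_field_properties({"name": "k", "type": "Optional[int]", "default": "3"}))
# ('k', 'int', 3, False)
```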
diff --git a/src/backend/langflow/interface/utilities/base.py b/src/backend/langflow/interface/utilities/base.py
index 6c12b0186..eb8cd60af 100644
--- a/src/backend/langflow/interface/utilities/base.py
+++ b/src/backend/langflow/interface/utilities/base.py
@@ -5,7 +5,8 @@ from langchain import SQLDatabase, utilities
from langflow.custom.customs import get_custom_nodes
from langflow.interface.base import LangChainTypeCreator
from langflow.interface.importing.utils import import_class
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.template.frontend_node.utilities import UtilitiesFrontendNode
from langflow.utils.logger import logger
from langflow.utils.util import build_template_from_class
@@ -26,6 +27,7 @@ class UtilityCreator(LangChainTypeCreator):
from the langchain.chains module and filtering them according to the settings.utilities list.
"""
if self.type_dict is None:
+ settings_manager = get_settings_manager()
self.type_dict = {
utility_name: import_class(f"langchain.utilities.{utility_name}")
for utility_name in utilities.__all__
@@ -35,7 +37,8 @@ class UtilityCreator(LangChainTypeCreator):
self.type_dict = {
name: utility
for name, utility in self.type_dict.items()
- if name in settings.utilities or settings.dev
+ if name in settings_manager.settings.UTILITIES
+ or settings_manager.settings.DEV
}
return self.type_dict
diff --git a/src/backend/langflow/interface/utils.py b/src/backend/langflow/interface/utils.py
index 9203915cf..1fddbf80f 100644
--- a/src/backend/langflow/interface/utils.py
+++ b/src/backend/langflow/interface/utils.py
@@ -9,7 +9,8 @@ import yaml
from langchain.base_language import BaseLanguageModel
from PIL.Image import Image
from langflow.utils.logger import logger
-from langflow.chat.config import ChatConfig
+from langflow.services.chat.config import ChatConfig
+from langflow.services.utils import get_settings_manager
def load_file_into_dict(file_path: str) -> dict:
@@ -63,24 +64,21 @@ def extract_input_variables_from_prompt(prompt: str) -> list[str]:
def setup_llm_caching():
"""Setup LLM caching."""
-
- from langflow.settings import settings
-
+ settings_manager = get_settings_manager()
try:
- set_langchain_cache(settings)
+ set_langchain_cache(settings_manager.settings)
except ImportError:
- logger.warning(f"Could not import {settings.cache}. ")
+ logger.warning(f"Could not import {settings_manager.settings.CACHE}. ")
except Exception as exc:
logger.warning(f"Could not setup LLM caching. Error: {exc}")
-# TODO Rename this here and in `setup_llm_caching`
def set_langchain_cache(settings):
import langchain
from langflow.interface.importing.utils import import_class
cache_type = os.getenv("LANGFLOW_LANGCHAIN_CACHE")
- cache_class = import_class(f"langchain.cache.{cache_type or settings.cache}")
+ cache_class = import_class(f"langchain.cache.{cache_type or settings.CACHE}")
logger.debug(f"Setting up LLM caching with {cache_class.__name__}")
langchain.llm_cache = cache_class()
diff --git a/src/backend/langflow/interface/vector_store/base.py b/src/backend/langflow/interface/vector_store/base.py
index 7ec1e0f5b..4b8ca2b64 100644
--- a/src/backend/langflow/interface/vector_store/base.py
+++ b/src/backend/langflow/interface/vector_store/base.py
@@ -4,7 +4,8 @@ from langchain import vectorstores
from langflow.interface.base import LangChainTypeCreator
from langflow.interface.importing.utils import import_class
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
+
from langflow.template.frontend_node.vectorstores import VectorStoreFrontendNode
from langflow.utils.logger import logger
from langflow.utils.util import build_template_from_method
@@ -43,10 +44,12 @@ class VectorstoreCreator(LangChainTypeCreator):
return None
def to_list(self) -> List[str]:
+ settings_manager = get_settings_manager()
return [
vectorstore
for vectorstore in self.type_to_loader_dict.keys()
- if vectorstore in settings.vectorstores or settings.dev
+ if vectorstore in settings_manager.settings.VECTORSTORES
+ or settings_manager.settings.DEV
]
diff --git a/src/backend/langflow/main.py b/src/backend/langflow/main.py
index 31878f851..1702fb8f9 100644
--- a/src/backend/langflow/main.py
+++ b/src/backend/langflow/main.py
@@ -6,13 +6,17 @@ from fastapi.responses import FileResponse
from fastapi.staticfiles import StaticFiles
from langflow.api import router
-from langflow.database.base import create_db_and_tables
from langflow.interface.utils import setup_llm_caching
+from langflow.services.database.utils import initialize_database
+from langflow.services.manager import initialize_services
+from langflow.utils.logger import configure
def create_app():
"""Create the FastAPI app and include the router."""
+ configure()
+
app = FastAPI()
origins = [
@@ -30,9 +34,10 @@ def create_app():
allow_methods=["*"],
allow_headers=["*"],
)
-
app.include_router(router)
- app.on_event("startup")(create_db_and_tables)
+
+ app.on_event("startup")(initialize_services)
+ app.on_event("startup")(initialize_database)
app.on_event("startup")(setup_llm_caching)
return app
@@ -65,23 +70,32 @@ def get_static_files_dir():
return frontend_path / "frontend"
-def setup_app(static_files_dir: Optional[Path] = None) -> FastAPI:
+def setup_app(
+ static_files_dir: Optional[Path] = None, backend_only: bool = False
+) -> FastAPI:
"""Setup the FastAPI app."""
# get the directory of the current file
if not static_files_dir:
static_files_dir = get_static_files_dir()
- if not static_files_dir or not static_files_dir.exists():
+ if not backend_only and (not static_files_dir or not static_files_dir.exists()):
raise RuntimeError(f"Static files directory {static_files_dir} does not exist.")
app = create_app()
- setup_static_files(app, static_files_dir)
+ if not backend_only and static_files_dir is not None:
+ setup_static_files(app, static_files_dir)
return app
-app = create_app()
-
-
if __name__ == "__main__":
import uvicorn
+ from langflow.utils.util import get_number_of_workers
- uvicorn.run(app, host="127.0.0.1", port=7860)
+ configure()
+    uvicorn.run(
+        "langflow.main:create_app", factory=True,
+        host="127.0.0.1",
+        port=7860,
+        workers=get_number_of_workers(),
+        log_level="debug",
+        reload=True,
+    )
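The new `backend_only` flag changes when a static files directory is required. A sketch of just that guard, extracted from `setup_app` (the function name here is illustrative, not from the PR):

```python
from pathlib import Path
from typing import Optional


def resolve_static_dir(
    static_files_dir: Optional[Path], backend_only: bool
) -> Optional[Path]:
    # Backend-only mode skips static files entirely; otherwise the
    # directory must exist, mirroring the RuntimeError in setup_app().
    if backend_only:
        return None
    if static_files_dir is None or not static_files_dir.exists():
        raise RuntimeError(
            f"Static files directory {static_files_dir} does not exist."
        )
    return static_files_dir
```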
diff --git a/src/backend/langflow/processing/base.py b/src/backend/langflow/processing/base.py
index f8690bbdf..f1d7b6e56 100644
--- a/src/backend/langflow/processing/base.py
+++ b/src/backend/langflow/processing/base.py
@@ -22,7 +22,7 @@ async def get_result_and_steps(langchain_object, inputs: Union[dict, str], **kwa
try:
fix_memory_inputs(langchain_object)
except Exception as exc:
- logger.error(exc)
+ logger.error(f"Error fixing memory inputs: {exc}")
try:
async_callbacks = [AsyncStreamingLLMCallbackHandler(**kwargs)]
diff --git a/src/backend/langflow/processing/process.py b/src/backend/langflow/processing/process.py
index 03e6e4c35..8cefb1f44 100644
--- a/src/backend/langflow/processing/process.py
+++ b/src/backend/langflow/processing/process.py
@@ -85,12 +85,17 @@ def get_input_str_if_only_one_input(inputs: dict) -> Optional[str]:
return list(inputs.values())[0] if len(inputs) == 1 else None
-def process_graph_cached(data_graph: Dict[str, Any], inputs: Optional[dict] = None):
+def process_graph_cached(
+ data_graph: Dict[str, Any], inputs: Optional[dict] = None, clear_cache=False
+):
"""
Process graph by extracting input variables and replacing ZeroShotPrompt
with PromptTemplate,then run the graph and return the result and thought.
"""
# Load langchain object
+ if clear_cache:
+ build_sorted_vertices_with_caching.clear_cache()
+ logger.debug("Cleared cache")
langchain_object, artifacts = build_sorted_vertices_with_caching(data_graph)
logger.debug("Loaded LangChain object")
if inputs is None:
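The new `clear_cache` flag relies on the cached builder exposing a `clear_cache()` method. The same pattern can be seen with the stdlib's `functools.lru_cache`, which is a stand-in here, not the PR's actual cache:

```python
from functools import lru_cache


@lru_cache(maxsize=1)
def build_graph(graph_json: str) -> dict:
    # Stand-in for build_sorted_vertices_with_caching: an expensive
    # build memoized on the serialized graph.
    print("building")
    return {"graph": graph_json}


build_graph("{}")          # prints "building"
build_graph("{}")          # served from cache, no print
build_graph.cache_clear()  # what clear_cache=True triggers in the PR
build_graph("{}")          # prints "building" again
```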
diff --git a/src/backend/langflow/services/__init__.py b/src/backend/langflow/services/__init__.py
new file mode 100644
index 000000000..8ac74b5b9
--- /dev/null
+++ b/src/backend/langflow/services/__init__.py
@@ -0,0 +1,4 @@
+from .manager import service_manager
+from .schema import ServiceType
+
+__all__ = ["service_manager", "ServiceType"]
diff --git a/src/backend/langflow/services/base.py b/src/backend/langflow/services/base.py
new file mode 100644
index 000000000..6bca6c4e2
--- /dev/null
+++ b/src/backend/langflow/services/base.py
@@ -0,0 +1,2 @@
+class Service:
+ name: str
diff --git a/src/backend/langflow/services/cache/__init__.py b/src/backend/langflow/services/cache/__init__.py
new file mode 100644
index 000000000..79e143807
--- /dev/null
+++ b/src/backend/langflow/services/cache/__init__.py
@@ -0,0 +1,11 @@
+from . import factory, manager
+from langflow.services.cache.manager import cache_manager
+from langflow.services.cache.flow import InMemoryCache
+
+
+__all__ = [
+ "cache_manager",
+ "factory",
+ "manager",
+ "InMemoryCache",
+]
diff --git a/src/backend/langflow/cache/base.py b/src/backend/langflow/services/cache/base.py
similarity index 100%
rename from src/backend/langflow/cache/base.py
rename to src/backend/langflow/services/cache/base.py
diff --git a/src/backend/langflow/services/cache/factory.py b/src/backend/langflow/services/cache/factory.py
new file mode 100644
index 000000000..77f8d58d1
--- /dev/null
+++ b/src/backend/langflow/services/cache/factory.py
@@ -0,0 +1,11 @@
+from langflow.services.cache.manager import CacheManager
+from langflow.services.factory import ServiceFactory
+
+
+class CacheManagerFactory(ServiceFactory):
+ def __init__(self):
+ super().__init__(CacheManager)
+
+ def create(self, settings_service):
+ # Here you would have logic to create and configure a CacheManager
+ return CacheManager()
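These factory classes plug into a service manager that this diff doesn't show. A minimal sketch of the assumed registry shape (the `register`/`get` names are guesses, based only on the `service_manager.get(ServiceType.CACHE_MANAGER)` call used elsewhere in the diff):

```python
class Service:
    name: str


class ServiceFactory:
    def __init__(self, service_class):
        self.service_class = service_class

    def create(self, settings_service):
        raise NotImplementedError


class ServiceManager:
    # Factories are registered per service name and instantiated
    # lazily on first get(); subsequent gets return the same instance.
    def __init__(self):
        self.factories = {}
        self.services = {}

    def register(self, name, factory):
        self.factories[name] = factory

    def get(self, name, settings_service=None):
        if name not in self.services:
            self.services[name] = self.factories[name].create(settings_service)
        return self.services[name]


class CacheManager(Service):
    name = "cache_manager"


class CacheManagerFactory(ServiceFactory):
    def __init__(self):
        super().__init__(CacheManager)

    def create(self, settings_service):
        return CacheManager()


manager = ServiceManager()
manager.register("cache_manager", CacheManagerFactory())
```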
diff --git a/src/backend/langflow/cache/flow.py b/src/backend/langflow/services/cache/flow.py
similarity index 98%
rename from src/backend/langflow/cache/flow.py
rename to src/backend/langflow/services/cache/flow.py
index 6d8fee977..0c10c51e1 100644
--- a/src/backend/langflow/cache/flow.py
+++ b/src/backend/langflow/services/cache/flow.py
@@ -2,7 +2,7 @@ import threading
import time
from collections import OrderedDict
-from langflow.cache.base import BaseCache
+from langflow.services.cache.base import BaseCache
class InMemoryCache(BaseCache):
diff --git a/src/backend/langflow/cache/manager.py b/src/backend/langflow/services/cache/manager.py
similarity index 97%
rename from src/backend/langflow/cache/manager.py
rename to src/backend/langflow/services/cache/manager.py
index 13b281008..ce9a338ef 100644
--- a/src/backend/langflow/cache/manager.py
+++ b/src/backend/langflow/services/cache/manager.py
@@ -1,5 +1,6 @@
from contextlib import contextmanager
from typing import Any, Awaitable, Callable, List, Optional
+from langflow.services.base import Service
import pandas as pd
from PIL import Image
@@ -49,9 +50,11 @@ class AsyncSubject:
await observer()
-class CacheManager(Subject):
+class CacheManager(Subject, Service):
"""Manages cache for different clients and notifies observers on changes."""
+ name = "cache_manager"
+
def __init__(self):
super().__init__()
self._cache = {}
diff --git a/src/backend/langflow/cache/utils.py b/src/backend/langflow/services/cache/utils.py
similarity index 100%
rename from src/backend/langflow/cache/utils.py
rename to src/backend/langflow/services/cache/utils.py
diff --git a/src/backend/langflow/services/chat/__init__.py b/src/backend/langflow/services/chat/__init__.py
new file mode 100644
index 000000000..e69de29bb
diff --git a/src/backend/langflow/chat/config.py b/src/backend/langflow/services/chat/config.py
similarity index 100%
rename from src/backend/langflow/chat/config.py
rename to src/backend/langflow/services/chat/config.py
diff --git a/src/backend/langflow/services/chat/factory.py b/src/backend/langflow/services/chat/factory.py
new file mode 100644
index 000000000..03597ed11
--- /dev/null
+++ b/src/backend/langflow/services/chat/factory.py
@@ -0,0 +1,11 @@
+from langflow.services.chat.manager import ChatManager
+from langflow.services.factory import ServiceFactory
+
+
+class ChatManagerFactory(ServiceFactory):
+ def __init__(self):
+ super().__init__(ChatManager)
+
+ def create(self, settings_service):
+ # Here you would have logic to create and configure a ChatManager
+ return ChatManager()
diff --git a/src/backend/langflow/chat/manager.py b/src/backend/langflow/services/chat/manager.py
similarity index 91%
rename from src/backend/langflow/chat/manager.py
rename to src/backend/langflow/services/chat/manager.py
index 33de784b5..a49f48273 100644
--- a/src/backend/langflow/chat/manager.py
+++ b/src/backend/langflow/services/chat/manager.py
@@ -1,10 +1,12 @@
from collections import defaultdict
from fastapi import WebSocket, status
from langflow.api.v1.schemas import ChatMessage, ChatResponse, FileResponse
-from langflow.cache import cache_manager
-from langflow.cache.manager import Subject
-from langflow.chat.utils import process_graph
+from langflow.services.base import Service
+from langflow.services import service_manager
+from langflow.services.cache.manager import Subject
+from langflow.services.chat.utils import process_graph
from langflow.interface.utils import pil_to_base64
+from langflow.services.schema import ServiceType
from langflow.utils.logger import logger
@@ -12,7 +14,7 @@ import asyncio
import json
from typing import Any, Dict, List
-from langflow.cache.flow import InMemoryCache
+from langflow.services.cache.flow import InMemoryCache
class ChatHistory(Subject):
@@ -42,11 +44,13 @@ class ChatHistory(Subject):
self.history[client_id] = []
-class ChatManager:
+class ChatManager(Service):
+ name = "chat_manager"
+
def __init__(self):
self.active_connections: Dict[str, WebSocket] = {}
self.chat_history = ChatHistory()
- self.cache_manager = cache_manager
+ self.cache_manager = service_manager.get(ServiceType.CACHE_MANAGER)
self.cache_manager.attach(self.update)
self.in_memory_cache = InMemoryCache()
@@ -111,13 +115,13 @@ class ChatManager:
# This is to catch the following error:
# Unexpected ASGI message 'websocket.close', after sending 'websocket.close'
if "after sending" in str(exc):
- logger.error(exc)
+ logger.error(f"Error closing connection: {exc}")
async def process_message(
self, client_id: str, payload: Dict, langchain_object: Any
):
# Process the graph data and chat message
- chat_inputs = payload.pop("inputs", "")
+ chat_inputs = payload.pop("inputs", {})
chat_inputs = ChatMessage(message=chat_inputs)
self.chat_history.add_message(client_id, chat_inputs)
@@ -197,13 +201,13 @@ class ChatManager:
langchain_object = self.in_memory_cache.get(client_id)
await self.process_message(client_id, payload, langchain_object)
- except Exception as e:
+ except Exception as exc:
# Handle any exceptions that might occur
- logger.error(e)
+ logger.error(f"Error handling websocket: {exc}")
await self.close_connection(
client_id=client_id,
code=status.WS_1011_INTERNAL_ERROR,
- reason=str(e)[:120],
+ reason=str(exc)[:120],
)
finally:
try:
@@ -212,6 +216,6 @@ class ChatManager:
code=status.WS_1000_NORMAL_CLOSURE,
reason="Client disconnected",
)
- except Exception as e:
- logger.error(e)
+ except Exception as exc:
+ logger.error(f"Error closing connection: {exc}")
self.disconnect(client_id)
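`ChatHistory` notifies attached observers on every `add_message` through the `Subject` base it inherits. A stripped-down, synchronous sketch of that observer wiring:

```python
class Subject:
    # Minimal version of the Subject the managers build on:
    # observers are plain callables invoked on every change.
    def __init__(self):
        self.observers = []

    def attach(self, observer):
        self.observers.append(observer)

    def notify(self):
        for observer in self.observers:
            observer()


class ChatHistory(Subject):
    def __init__(self):
        super().__init__()
        self.history = {}

    def add_message(self, client_id, message):
        self.history.setdefault(client_id, []).append(message)
        self.notify()


events = []
history = ChatHistory()
history.attach(lambda: events.append("changed"))
history.add_message("client-1", "hello")
```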
diff --git a/src/backend/langflow/chat/utils.py b/src/backend/langflow/services/chat/utils.py
similarity index 93%
rename from src/backend/langflow/chat/utils.py
rename to src/backend/langflow/services/chat/utils.py
index 7db65b8e3..17c976eb9 100644
--- a/src/backend/langflow/chat/utils.py
+++ b/src/backend/langflow/services/chat/utils.py
@@ -21,9 +21,9 @@ async def process_graph(
# Generate result and thought
try:
- if not chat_inputs.message:
+ if chat_inputs.message is None:
logger.debug("No message provided")
- raise ValueError("No message provided")
+ chat_inputs.message = {}
logger.debug("Generating result and thought")
result, intermediate_steps = await get_result_and_steps(
diff --git a/src/backend/langflow/services/database/__init__.py b/src/backend/langflow/services/database/__init__.py
new file mode 100644
index 000000000..e69de29bb
diff --git a/src/backend/langflow/services/database/factory.py b/src/backend/langflow/services/database/factory.py
new file mode 100644
index 000000000..fecf24543
--- /dev/null
+++ b/src/backend/langflow/services/database/factory.py
@@ -0,0 +1,17 @@
+from typing import TYPE_CHECKING
+from langflow.services.database.manager import DatabaseManager
+from langflow.services.factory import ServiceFactory
+
+if TYPE_CHECKING:
+ from langflow.services.settings.manager import SettingsManager
+
+
+class DatabaseManagerFactory(ServiceFactory):
+ def __init__(self):
+ super().__init__(DatabaseManager)
+
+ def create(self, settings_service: "SettingsManager"):
+ # Here you would have logic to create and configure a DatabaseManager
+ if not settings_service.settings.DATABASE_URL:
+ raise ValueError("No database URL provided")
+ return DatabaseManager(settings_service.settings.DATABASE_URL)
diff --git a/src/backend/langflow/services/database/manager.py b/src/backend/langflow/services/database/manager.py
new file mode 100644
index 000000000..92385a457
--- /dev/null
+++ b/src/backend/langflow/services/database/manager.py
@@ -0,0 +1,67 @@
+from pathlib import Path
+from langflow.services.base import Service
+from sqlmodel import SQLModel, Session, create_engine
+from langflow.utils.logger import logger
+from alembic.config import Config
+from alembic import command
+from langflow.services.database import models # noqa
+
+
+class DatabaseManager(Service):
+ name = "database_manager"
+
+ def __init__(self, database_url: str):
+ self.database_url = database_url
+        # This file lives in langflow/services/database/, while the
+        # alembic scripts and alembic.ini live in the langflow package root.
+ langflow_dir = Path(__file__).parent.parent.parent
+ self.script_location = langflow_dir / "alembic"
+ self.alembic_cfg_path = langflow_dir / "alembic.ini"
+ self.engine = create_engine(database_url)
+
+ def __enter__(self):
+ self._session = Session(self.engine)
+ return self._session
+
+ def __exit__(self, exc_type, exc_value, traceback):
+ if exc_type is not None: # If an exception has been raised
+ logger.error(
+ f"Session rollback because of exception: {exc_type.__name__} {exc_value}"
+ )
+ self._session.rollback()
+ else:
+ self._session.commit()
+ self._session.close()
+
+ def get_session(self):
+ with Session(self.engine) as session:
+ yield session
+
+ def run_migrations(self):
+ logger.info(
+ f"Running DB migrations in {self.script_location} on {self.database_url}"
+ )
+ alembic_cfg = Config()
+ alembic_cfg.set_main_option("script_location", str(self.script_location))
+ alembic_cfg.set_main_option("sqlalchemy.url", self.database_url)
+ command.upgrade(alembic_cfg, "head")
+
+ def create_db_and_tables(self):
+ logger.debug("Creating database and tables")
+ try:
+ SQLModel.metadata.create_all(self.engine)
+ except Exception as exc:
+ logger.error(f"Error creating database and tables: {exc}")
+ raise RuntimeError("Error creating database and tables") from exc
+
+ # Now check if the table "flow" exists, if not, something went wrong
+ # and we need to create the tables again.
+ from sqlalchemy import inspect
+
+ inspector = inspect(self.engine)
+ if "flow" not in inspector.get_table_names():
+ logger.error("Something went wrong creating the database and tables.")
+ logger.error("Please check your database settings.")
+ raise RuntimeError("Something went wrong creating the database and tables.")
+ else:
+ logger.debug("Database and tables created successfully")
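`DatabaseManager.__enter__`/`__exit__` give commit-on-success, rollback-on-error sessions. The same shape with stdlib `sqlite3`, purely for illustration; the real class wraps a SQLModel `Session`:

```python
import sqlite3


class SessionContext:
    # Commit on success, roll back on error, always close:
    # the same contract as DatabaseManager.__enter__/__exit__.
    def __init__(self, url: str = ":memory:"):
        self.conn = sqlite3.connect(url)

    def __enter__(self):
        return self.conn

    def __exit__(self, exc_type, exc_value, tb):
        if exc_type is not None:
            self.conn.rollback()
        else:
            self.conn.commit()
        self.conn.close()
        return False  # never swallow the exception


ctx = SessionContext()
with ctx as conn:
    conn.execute("CREATE TABLE flow (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO flow (name) VALUES ('demo')")
```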
diff --git a/src/backend/langflow/services/database/models/__init__.py b/src/backend/langflow/services/database/models/__init__.py
new file mode 100644
index 000000000..da47bc5fe
--- /dev/null
+++ b/src/backend/langflow/services/database/models/__init__.py
@@ -0,0 +1,4 @@
+from .flow import Flow
+
+
+__all__ = ["Flow"]
diff --git a/src/backend/langflow/database/models/base.py b/src/backend/langflow/services/database/models/base.py
similarity index 100%
rename from src/backend/langflow/database/models/base.py
rename to src/backend/langflow/services/database/models/base.py
diff --git a/src/backend/langflow/services/database/models/component.py b/src/backend/langflow/services/database/models/component.py
new file mode 100644
index 000000000..5c4e6c13a
--- /dev/null
+++ b/src/backend/langflow/services/database/models/component.py
@@ -0,0 +1,29 @@
+from langflow.services.database.models.base import SQLModelSerializable, SQLModel
+from sqlmodel import Field
+from typing import Optional
+from datetime import datetime
+import uuid
+
+
+class Component(SQLModelSerializable, table=True):
+ id: uuid.UUID = Field(default_factory=uuid.uuid4, primary_key=True)
+ frontend_node_id: uuid.UUID = Field(index=True)
+ name: str = Field(index=True)
+ description: Optional[str] = Field(default=None)
+ python_code: Optional[str] = Field(default=None)
+ return_type: Optional[str] = Field(default=None)
+ is_disabled: bool = Field(default=False)
+ is_read_only: bool = Field(default=False)
+ create_at: datetime = Field(default_factory=datetime.utcnow)
+ update_at: datetime = Field(default_factory=datetime.utcnow)
+
+
+class ComponentModel(SQLModel):
+ id: uuid.UUID = Field(default_factory=uuid.uuid4)
+    frontend_node_id: uuid.UUID = Field(default_factory=uuid.uuid4)
+ name: str = Field(default="")
+ description: Optional[str] = None
+ python_code: Optional[str] = None
+ return_type: Optional[str] = None
+ is_disabled: bool = False
+ is_read_only: bool = False
diff --git a/src/backend/langflow/database/models/flow.py b/src/backend/langflow/services/database/models/flow.py
similarity index 72%
rename from src/backend/langflow/database/models/flow.py
rename to src/backend/langflow/services/database/models/flow.py
index f9e3aa249..2bc83f9dc 100644
--- a/src/backend/langflow/database/models/flow.py
+++ b/src/backend/langflow/services/database/models/flow.py
@@ -1,13 +1,12 @@
# Path: src/backend/langflow/database/models/flow.py
-from langflow.database.models.base import SQLModelSerializable
+from langflow.services.database.models.base import SQLModelSerializable
from pydantic import validator
-from sqlmodel import Field, Relationship, JSON, Column
+from sqlmodel import Field, JSON, Column
from uuid import UUID, uuid4
from typing import Dict, Optional
# if TYPE_CHECKING:
-from langflow.database.models.flow_style import FlowStyle, FlowStyleRead
class FlowBase(SQLModelSerializable):
@@ -35,11 +34,6 @@ class FlowBase(SQLModelSerializable):
class Flow(FlowBase, table=True):
id: UUID = Field(default_factory=uuid4, primary_key=True, unique=True)
data: Optional[Dict] = Field(default=None, sa_column=Column(JSON))
- style: Optional["FlowStyle"] = Relationship(
- back_populates="flow",
- # use "uselist=False" to make it a one-to-one relationship
- sa_relationship_kwargs={"uselist": False},
- )
class FlowCreate(FlowBase):
@@ -50,10 +44,6 @@ class FlowRead(FlowBase):
id: UUID
-class FlowReadWithStyle(FlowRead):
- style: Optional["FlowStyleRead"] = None
-
-
class FlowUpdate(SQLModelSerializable):
name: Optional[str] = None
description: Optional[str] = None
diff --git a/src/backend/langflow/services/database/utils.py b/src/backend/langflow/services/database/utils.py
new file mode 100644
index 000000000..94bcd6651
--- /dev/null
+++ b/src/backend/langflow/services/database/utils.py
@@ -0,0 +1,47 @@
+from typing import TYPE_CHECKING
+from langflow.utils.logger import logger
+from contextlib import contextmanager
+from alembic.util.exc import CommandError
+from sqlmodel import Session
+
+if TYPE_CHECKING:
+ from langflow.services.database.manager import DatabaseManager
+
+
+def initialize_database():
+ logger.debug("Initializing database")
+ from langflow.services import service_manager, ServiceType
+
+ database_manager = service_manager.get(ServiceType.DATABASE_MANAGER)
+ try:
+ database_manager.run_migrations()
+ except CommandError as exc:
+ if "Can't locate revision identified by" not in str(exc):
+ raise exc
+        # A wrong revision is stored in the DB, so we drop the
+        # alembic_version table and run the migrations again.
+ logger.warning(
+ "Wrong revision in DB, deleting alembic_version table and running migrations again"
+ )
+ with session_getter(database_manager) as session:
+ session.execute("DROP TABLE alembic_version")
+ database_manager.run_migrations()
+ except Exception as exc:
+ logger.error(f"Error running migrations: {exc}")
+ raise RuntimeError("Error running migrations") from exc
+ database_manager.create_db_and_tables()
+ logger.debug("Database initialized")
+
+
+@contextmanager
+def session_getter(db_manager: "DatabaseManager"):
+ try:
+ session = Session(db_manager.engine)
+ yield session
+ except Exception as e:
+        logger.error(f"Session rollback because of exception: {e}")
+ session.rollback()
+ raise
+ finally:
+ session.close()
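The rollback-then-close pattern in `session_getter` can be exercised without a real engine. Below is a minimal stdlib sketch in which a hypothetical `DummySession` stands in for `sqlmodel.Session` and records which lifecycle calls it receives:

```python
from contextlib import contextmanager


class DummySession:
    """Hypothetical stand-in for sqlmodel.Session; records lifecycle calls."""

    def __init__(self):
        self.calls = []

    def rollback(self):
        self.calls.append("rollback")

    def close(self):
        self.calls.append("close")


@contextmanager
def session_getter(session_factory):
    # Same shape as the helper above: roll back on error, always close.
    session = session_factory()
    try:
        yield session
    except Exception:
        session.rollback()
        raise
    finally:
        session.close()


failing = DummySession()
try:
    with session_getter(lambda: failing):
        raise ValueError("boom")
except ValueError:
    pass  # rollback and close both ran

succeeding = DummySession()
with session_getter(lambda: succeeding):
    pass  # only close runs on the happy path
```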
diff --git a/src/backend/langflow/services/factory.py b/src/backend/langflow/services/factory.py
new file mode 100644
index 000000000..c37f4e9c2
--- /dev/null
+++ b/src/backend/langflow/services/factory.py
@@ -0,0 +1,6 @@
+class ServiceFactory:
+ def __init__(self, service_class):
+ self.service_class = service_class
+
+ def create(self, *args, **kwargs):
+ raise NotImplementedError
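`ServiceFactory` is an abstract base: a subclass binds a service class and implements `create`. A sketch with a hypothetical `ClockService` (not part of langflow) showing the intended subclassing shape:

```python
class ServiceFactory:
    def __init__(self, service_class):
        self.service_class = service_class

    def create(self, *args, **kwargs):
        raise NotImplementedError


class ClockService:
    """Hypothetical service used only for illustration."""

    name = "clock_service"

    def __init__(self, tz="UTC"):
        self.tz = tz


class ClockServiceFactory(ServiceFactory):
    def __init__(self):
        super().__init__(ClockService)

    def create(self, tz="UTC"):
        # Concrete factories own the construction details of their service.
        return ClockService(tz=tz)


factory = ClockServiceFactory()
svc = factory.create(tz="CET")
```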
diff --git a/src/backend/langflow/services/manager.py b/src/backend/langflow/services/manager.py
new file mode 100644
index 000000000..1606b3a82
--- /dev/null
+++ b/src/backend/langflow/services/manager.py
@@ -0,0 +1,87 @@
+from langflow.services.schema import ServiceType
+from typing import TYPE_CHECKING
+
+if TYPE_CHECKING:
+ from langflow.services.factory import ServiceFactory
+
+
+class ServiceManager:
+ """
+ Manages the creation of different services.
+ """
+
+ def __init__(self):
+ self.services = {}
+ self.factories = {}
+
+ def register_factory(self, service_factory: "ServiceFactory"):
+ """
+ Registers a new factory.
+ """
+ self.factories[service_factory.service_class.name] = service_factory
+
+ def get(self, service_name: ServiceType):
+ """
+ Get (or create) a service by its name.
+ """
+ if service_name not in self.services:
+ self._create_service(service_name)
+
+ return self.services[service_name]
+
+ def _create_service(self, service_name: ServiceType):
+ """
+ Create a new service given its name.
+ """
+ self._validate_service_creation(service_name)
+
+ if service_name == ServiceType.SETTINGS_MANAGER:
+ self.services[service_name] = self.factories[service_name].create()
+ else:
+ settings_service = self.get(ServiceType.SETTINGS_MANAGER)
+ self.services[service_name] = self.factories[service_name].create(
+ settings_service
+ )
+
+ def _validate_service_creation(self, service_name: ServiceType):
+ """
+ Validate whether the service can be created.
+ """
+ if service_name not in self.factories:
+ raise ValueError(
+ f"No factory registered for the service class '{service_name.name}'"
+ )
+
+ if (
+ ServiceType.SETTINGS_MANAGER not in self.factories
+ and service_name != ServiceType.SETTINGS_MANAGER
+ ):
+ raise ValueError(
+ f"Cannot create service '{service_name.name}' before the settings service"
+ )
+
+ def update(self, service_name: ServiceType):
+ """
+ Update a service by its name.
+ """
+ if service_name in self.services:
+ self.services.pop(service_name, None)
+ self.get(service_name)
+
+
+service_manager = ServiceManager()
+
+
+def initialize_services():
+ """
+ Initialize all the services needed.
+ """
+ from langflow.services.database import factory as database_factory
+ from langflow.services.cache import factory as cache_factory
+ from langflow.services.chat import factory as chat_factory
+ from langflow.services.settings import factory as settings_factory
+
+ service_manager.register_factory(settings_factory.SettingsManagerFactory())
+ service_manager.register_factory(database_factory.DatabaseManagerFactory())
+ service_manager.register_factory(cache_factory.CacheManagerFactory())
+ service_manager.register_factory(chat_factory.ChatManagerFactory())
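The lazy create-and-cache behavior of `ServiceManager.get` can be modeled with plain dictionaries. The snippet below is a simplified, stand-alone sketch (string keys instead of `ServiceType`, and it omits the settings-service dependency injection that the real `_create_service` performs):

```python
class ServiceFactory:
    def __init__(self, service_class):
        self.service_class = service_class

    def create(self, *args, **kwargs):
        raise NotImplementedError


class SettingsService:
    """Hypothetical minimal settings service for illustration."""

    name = "settings_manager"

    def __init__(self):
        self.values = {"cache": "InMemoryCache"}


class SettingsFactory(ServiceFactory):
    def __init__(self):
        super().__init__(SettingsService)

    def create(self):
        return SettingsService()


class ServiceManager:
    def __init__(self):
        self.services, self.factories = {}, {}

    def register_factory(self, factory):
        # Factories are keyed by their service class's name attribute.
        self.factories[factory.service_class.name] = factory

    def get(self, name):
        # Create on first access, then return the cached instance.
        if name not in self.services:
            self.services[name] = self.factories[name].create()
        return self.services[name]


manager = ServiceManager()
manager.register_factory(SettingsFactory())
a = manager.get("settings_manager")
b = manager.get("settings_manager")
```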
diff --git a/src/backend/langflow/services/schema.py b/src/backend/langflow/services/schema.py
new file mode 100644
index 000000000..695763afc
--- /dev/null
+++ b/src/backend/langflow/services/schema.py
@@ -0,0 +1,13 @@
+from enum import Enum
+
+
+class ServiceType(str, Enum):
+ """
+ Enum for the different types of services that can be
+ registered with the service manager.
+ """
+
+ CACHE_MANAGER = "cache_manager"
+ SETTINGS_MANAGER = "settings_manager"
+ DATABASE_MANAGER = "database_manager"
+ CHAT_MANAGER = "chat_manager"
diff --git a/src/backend/langflow/services/settings/__init__.py b/src/backend/langflow/services/settings/__init__.py
new file mode 100644
index 000000000..2191bf2cc
--- /dev/null
+++ b/src/backend/langflow/services/settings/__init__.py
@@ -0,0 +1,3 @@
+from . import factory, manager
+
+__all__ = ["factory", "manager"]
diff --git a/src/backend/langflow/services/settings/base.py b/src/backend/langflow/services/settings/base.py
new file mode 100644
index 000000000..1eb2793b3
--- /dev/null
+++ b/src/backend/langflow/services/settings/base.py
@@ -0,0 +1,168 @@
+import contextlib
+import json
+import os
+from typing import Optional, List
+from pathlib import Path
+
+import yaml
+from pydantic import BaseSettings, root_validator, validator
+from langflow.utils.logger import logger
+
+BASE_COMPONENTS_PATH = str(Path(__file__).parent / "components")
+
+
+class Settings(BaseSettings):
+ CHAINS: dict = {}
+ AGENTS: dict = {}
+ PROMPTS: dict = {}
+ LLMS: dict = {}
+ TOOLS: dict = {}
+ MEMORIES: dict = {}
+ EMBEDDINGS: dict = {}
+ VECTORSTORES: dict = {}
+ DOCUMENTLOADERS: dict = {}
+ WRAPPERS: dict = {}
+ RETRIEVERS: dict = {}
+ TOOLKITS: dict = {}
+ TEXTSPLITTERS: dict = {}
+ UTILITIES: dict = {}
+ OUTPUT_PARSERS: dict = {}
+ CUSTOM_COMPONENTS: dict = {}
+
+ DEV: bool = False
+ DATABASE_URL: Optional[str] = None
+ CACHE: str = "InMemoryCache"
+ REMOVE_API_KEYS: bool = False
+ COMPONENTS_PATH: List[str] = []
+
+ @validator("DATABASE_URL", pre=True)
+ def set_database_url(cls, value):
+ if not value:
+ logger.debug(
+ "No database_url provided, trying LANGFLOW_DATABASE_URL env variable"
+ )
+ if langflow_database_url := os.getenv("LANGFLOW_DATABASE_URL"):
+ value = langflow_database_url
+ logger.debug("Using LANGFLOW_DATABASE_URL env variable.")
+ else:
+ logger.debug("No DATABASE_URL env variable, using sqlite database")
+ value = "sqlite:///./langflow.db"
+
+ return value
+
+ @validator("COMPONENTS_PATH", pre=True)
+ def set_components_path(cls, value):
+ if os.getenv("LANGFLOW_COMPONENTS_PATH"):
+ logger.debug("Adding LANGFLOW_COMPONENTS_PATH to components_path")
+ langflow_component_path = os.getenv("LANGFLOW_COMPONENTS_PATH")
+ if (
+ Path(langflow_component_path).exists()
+ and langflow_component_path not in value
+ ):
+ if isinstance(langflow_component_path, list):
+ for path in langflow_component_path:
+ if path not in value:
+ value.append(path)
+ logger.debug(
+ f"Extending {langflow_component_path} to components_path"
+ )
+ elif langflow_component_path not in value:
+ value.append(langflow_component_path)
+ logger.debug(
+ f"Appending {langflow_component_path} to components_path"
+ )
+
+ if not value:
+ value = [BASE_COMPONENTS_PATH]
+ logger.debug("Setting default components path to components_path")
+ elif BASE_COMPONENTS_PATH not in value:
+ value.append(BASE_COMPONENTS_PATH)
+ logger.debug("Adding default components path to components_path")
+
+ logger.debug(f"Components path: {value}")
+ return value
+
+ class Config:
+ validate_assignment = True
+ extra = "ignore"
+ env_prefix = "LANGFLOW_"
+
+ @root_validator(allow_reuse=True)
+ def validate_lists(cls, values):
+ for key, value in values.items():
+            if key != "DEV" and not value:
+ values[key] = []
+ return values
+
+ def update_from_yaml(self, file_path: str, dev: bool = False):
+ new_settings = load_settings_from_yaml(file_path)
+ self.CHAINS = new_settings.CHAINS or {}
+ self.AGENTS = new_settings.AGENTS or {}
+ self.PROMPTS = new_settings.PROMPTS or {}
+ self.LLMS = new_settings.LLMS or {}
+ self.TOOLS = new_settings.TOOLS or {}
+ self.MEMORIES = new_settings.MEMORIES or {}
+ self.WRAPPERS = new_settings.WRAPPERS or {}
+ self.TOOLKITS = new_settings.TOOLKITS or {}
+ self.TEXTSPLITTERS = new_settings.TEXTSPLITTERS or {}
+ self.UTILITIES = new_settings.UTILITIES or {}
+ self.EMBEDDINGS = new_settings.EMBEDDINGS or {}
+ self.VECTORSTORES = new_settings.VECTORSTORES or {}
+ self.DOCUMENTLOADERS = new_settings.DOCUMENTLOADERS or {}
+ self.RETRIEVERS = new_settings.RETRIEVERS or {}
+ self.OUTPUT_PARSERS = new_settings.OUTPUT_PARSERS or {}
+ self.CUSTOM_COMPONENTS = new_settings.CUSTOM_COMPONENTS or {}
+ self.COMPONENTS_PATH = new_settings.COMPONENTS_PATH or []
+ self.DEV = dev
+
+ def update_settings(self, **kwargs):
+ logger.debug("Updating settings")
+ for key, value in kwargs.items():
+ # value may contain sensitive information, so we don't want to log it
+ if not hasattr(self, key):
+ logger.debug(f"Key {key} not found in settings")
+ continue
+ logger.debug(f"Updating {key}")
+ if isinstance(getattr(self, key), list):
+ # value might be a '[something]' string
+ with contextlib.suppress(json.decoder.JSONDecodeError):
+ value = json.loads(str(value))
+ if isinstance(value, list):
+ for item in value:
+ if item not in getattr(self, key):
+ getattr(self, key).append(item)
+ logger.debug(f"Extended {key}")
+ else:
+ getattr(self, key).append(value)
+ logger.debug(f"Appended {key}")
+
+ else:
+ setattr(self, key, value)
+ logger.debug(f"Updated {key}")
+ logger.debug(f"{key}: {getattr(self, key)}")
+
+
+def save_settings_to_yaml(settings: Settings, file_path: str):
+ with open(file_path, "w") as f:
+ settings_dict = settings.dict()
+ yaml.dump(settings_dict, f)
+
+
+def load_settings_from_yaml(file_path: str) -> Settings:
+ # Check if a string is a valid path or a file name
+ if "/" not in file_path:
+ # Get current path
+ current_path = os.path.dirname(os.path.abspath(__file__))
+
+ file_path = os.path.join(current_path, file_path)
+
+ with open(file_path, "r") as f:
+ settings_dict = yaml.safe_load(f)
+ settings_dict = {k.upper(): v for k, v in settings_dict.items()}
+
+ for key in settings_dict:
+ if key not in Settings.__fields__.keys():
+ raise KeyError(f"Key {key} not found in settings")
+ logger.debug(f"Loading {len(settings_dict[key])} {key} from {file_path}")
+
+ return Settings(**settings_dict)
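The `DATABASE_URL` validator's fallback chain (explicit value, then the `LANGFLOW_DATABASE_URL` environment variable, then a local SQLite file) can be mirrored as a plain function. This is a sketch of the logic only, not the pydantic validator itself:

```python
import os


def resolve_database_url(value=None):
    # 1. An explicitly provided URL wins.
    if value:
        return value
    # 2. Otherwise fall back to the environment variable.
    env_url = os.getenv("LANGFLOW_DATABASE_URL")
    if env_url:
        return env_url
    # 3. Last resort: a SQLite file in the working directory.
    return "sqlite:///./langflow.db"


os.environ.pop("LANGFLOW_DATABASE_URL", None)
default_url = resolve_database_url()
os.environ["LANGFLOW_DATABASE_URL"] = "postgresql://example/db"
env_url = resolve_database_url()
explicit_url = resolve_database_url("sqlite:///custom.db")
os.environ.pop("LANGFLOW_DATABASE_URL", None)
```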
diff --git a/src/backend/langflow/services/settings/factory.py b/src/backend/langflow/services/settings/factory.py
new file mode 100644
index 000000000..ab22e22b8
--- /dev/null
+++ b/src/backend/langflow/services/settings/factory.py
@@ -0,0 +1,15 @@
+from pathlib import Path
+from langflow.services.settings.manager import SettingsManager
+from langflow.services.factory import ServiceFactory
+
+
+class SettingsManagerFactory(ServiceFactory):
+ def __init__(self):
+ super().__init__(SettingsManager)
+
+ def create(self):
+        # Load the SettingsManager from the config.yaml bundled with the langflow package.
+ langflow_dir = Path(__file__).parent.parent.parent
+ return SettingsManager.load_settings_from_yaml(
+ str(langflow_dir / "config.yaml")
+ )
diff --git a/src/backend/langflow/services/settings/manager.py b/src/backend/langflow/services/settings/manager.py
new file mode 100644
index 000000000..a357c4804
--- /dev/null
+++ b/src/backend/langflow/services/settings/manager.py
@@ -0,0 +1,36 @@
+from langflow.services.base import Service
+from langflow.services.settings.base import Settings
+from langflow.utils.logger import logger
+import os
+import yaml
+
+
+class SettingsManager(Service):
+ name = "settings_manager"
+
+ def __init__(self, settings: Settings):
+ super().__init__()
+ self.settings = settings
+
+ @classmethod
+ def load_settings_from_yaml(cls, file_path: str) -> "SettingsManager":
+ # Check if a string is a valid path or a file name
+ if "/" not in file_path:
+ # Get current path
+ current_path = os.path.dirname(os.path.abspath(__file__))
+
+ file_path = os.path.join(current_path, file_path)
+
+ with open(file_path, "r") as f:
+ settings_dict = yaml.safe_load(f)
+ settings_dict = {k.upper(): v for k, v in settings_dict.items()}
+
+ for key in settings_dict:
+ if key not in Settings.__fields__.keys():
+ raise KeyError(f"Key {key} not found in settings")
+ logger.debug(
+ f"Loading {len(settings_dict[key])} {key} from {file_path}"
+ )
+
+ settings = Settings(**settings_dict)
+ return cls(settings)
diff --git a/src/backend/langflow/services/settings/settings.py b/src/backend/langflow/services/settings/settings.py
new file mode 100644
index 000000000..439b3a1e4
--- /dev/null
+++ b/src/backend/langflow/services/settings/settings.py
@@ -0,0 +1,171 @@
+import contextlib
+import json
+import os
+from typing import Optional, List
+from pathlib import Path
+
+import yaml
+from pydantic import BaseSettings, root_validator, validator
+from langflow.utils.logger import logger
+
+BASE_COMPONENTS_PATH = str(Path(__file__).parent / "components")
+
+
+class Settings(BaseSettings):
+ CHAINS: dict = {}
+ AGENTS: dict = {}
+ PROMPTS: dict = {}
+ LLMS: dict = {}
+ TOOLS: dict = {}
+ MEMORIES: dict = {}
+ EMBEDDINGS: dict = {}
+ VECTORSTORES: dict = {}
+ DOCUMENTLOADERS: dict = {}
+ WRAPPERS: dict = {}
+ RETRIEVERS: dict = {}
+ TOOLKITS: dict = {}
+ TEXTSPLITTERS: dict = {}
+ UTILITIES: dict = {}
+ OUTPUT_PARSERS: dict = {}
+ CUSTOM_COMPONENTS: dict = {}
+
+ DEV: bool = False
+ DATABASE_URL: Optional[str] = None
+ CACHE: str = "InMemoryCache"
+ REMOVE_API_KEYS: bool = False
+ COMPONENTS_PATH: List[str] = []
+
+ @validator("DATABASE_URL", pre=True)
+ def set_database_url(cls, value):
+ if not value:
+ logger.debug(
+ "No database_url provided, trying LANGFLOW_DATABASE_URL env variable"
+ )
+ if langflow_database_url := os.getenv("LANGFLOW_DATABASE_URL"):
+ value = langflow_database_url
+ logger.debug("Using LANGFLOW_DATABASE_URL env variable.")
+ else:
+ logger.debug("No DATABASE_URL env variable, using sqlite database")
+ value = "sqlite:///./langflow.db"
+
+ return value
+
+ @validator("COMPONENTS_PATH", pre=True)
+ def set_components_path(cls, value):
+ if os.getenv("LANGFLOW_COMPONENTS_PATH"):
+ logger.debug("Adding LANGFLOW_COMPONENTS_PATH to components_path")
+ langflow_component_path = os.getenv("LANGFLOW_COMPONENTS_PATH")
+ if (
+ Path(langflow_component_path).exists()
+ and langflow_component_path not in value
+ ):
+ if isinstance(langflow_component_path, list):
+ for path in langflow_component_path:
+ if path not in value:
+ value.append(path)
+ logger.debug(
+ f"Extending {langflow_component_path} to components_path"
+ )
+ elif langflow_component_path not in value:
+ value.append(langflow_component_path)
+ logger.debug(
+ f"Appending {langflow_component_path} to components_path"
+ )
+
+ if not value:
+ value = [BASE_COMPONENTS_PATH]
+ logger.debug("Setting default components path to components_path")
+ elif BASE_COMPONENTS_PATH not in value:
+ value.append(BASE_COMPONENTS_PATH)
+ logger.debug("Adding default components path to components_path")
+
+ logger.debug(f"Components path: {value}")
+ return value
+
+ class Config:
+ validate_assignment = True
+ extra = "ignore"
+ env_prefix = "LANGFLOW_"
+
+ @root_validator(allow_reuse=True)
+ def validate_lists(cls, values):
+ for key, value in values.items():
+            if key != "DEV" and not value:
+ values[key] = []
+ return values
+
+ def update_from_yaml(self, file_path: str, dev: bool = False):
+ new_settings = load_settings_from_yaml(file_path)
+ self.CHAINS = new_settings.CHAINS or {}
+ self.AGENTS = new_settings.AGENTS or {}
+ self.PROMPTS = new_settings.PROMPTS or {}
+ self.LLMS = new_settings.LLMS or {}
+ self.TOOLS = new_settings.TOOLS or {}
+ self.MEMORIES = new_settings.MEMORIES or {}
+ self.WRAPPERS = new_settings.WRAPPERS or {}
+ self.TOOLKITS = new_settings.TOOLKITS or {}
+ self.TEXTSPLITTERS = new_settings.TEXTSPLITTERS or {}
+ self.UTILITIES = new_settings.UTILITIES or {}
+ self.EMBEDDINGS = new_settings.EMBEDDINGS or {}
+ self.VECTORSTORES = new_settings.VECTORSTORES or {}
+ self.DOCUMENTLOADERS = new_settings.DOCUMENTLOADERS or {}
+ self.RETRIEVERS = new_settings.RETRIEVERS or {}
+ self.OUTPUT_PARSERS = new_settings.OUTPUT_PARSERS or {}
+ self.CUSTOM_COMPONENTS = new_settings.CUSTOM_COMPONENTS or {}
+ self.COMPONENTS_PATH = new_settings.COMPONENTS_PATH or []
+ self.DEV = dev
+
+ def update_settings(self, **kwargs):
+ logger.debug("Updating settings")
+ for key, value in kwargs.items():
+ # value may contain sensitive information, so we don't want to log it
+ if not hasattr(self, key):
+ logger.debug(f"Key {key} not found in settings")
+ continue
+ logger.debug(f"Updating {key}")
+ if isinstance(getattr(self, key), list):
+ # value might be a '[something]' string
+ with contextlib.suppress(json.decoder.JSONDecodeError):
+ value = json.loads(str(value))
+ if isinstance(value, list):
+ for item in value:
+ if item not in getattr(self, key):
+ getattr(self, key).append(item)
+ logger.debug(f"Extended {key}")
+ else:
+ getattr(self, key).append(value)
+ logger.debug(f"Appended {key}")
+
+ else:
+ setattr(self, key, value)
+ logger.debug(f"Updated {key}")
+ logger.debug(f"{key}: {getattr(self, key)}")
+
+
+def save_settings_to_yaml(settings: Settings, file_path: str):
+ with open(file_path, "w") as f:
+ settings_dict = settings.dict()
+ yaml.dump(settings_dict, f)
+
+
+def load_settings_from_yaml(file_path: str) -> Settings:
+ # Check if a string is a valid path or a file name
+ if "/" not in file_path:
+ # Get current path
+ current_path = os.path.dirname(os.path.abspath(__file__))
+
+ file_path = os.path.join(current_path, file_path)
+
+ with open(file_path, "r") as f:
+ settings_dict = yaml.safe_load(f)
+ settings_dict = {k.upper(): v for k, v in settings_dict.items()}
+
+ for key in settings_dict:
+ if key not in Settings.__fields__.keys():
+ raise KeyError(f"Key {key} not found in settings")
+ logger.debug(f"Loading {len(settings_dict[key])} {key} from {file_path}")
+
+ return Settings(**settings_dict)
+
+
+settings = load_settings_from_yaml("config.yaml")
diff --git a/src/backend/langflow/services/utils.py b/src/backend/langflow/services/utils.py
new file mode 100644
index 000000000..049e82c0f
--- /dev/null
+++ b/src/backend/langflow/services/utils.py
@@ -0,0 +1,18 @@
+from langflow.services import ServiceType, service_manager
+from typing import TYPE_CHECKING
+
+if TYPE_CHECKING:
+ from langflow.services.settings.manager import SettingsManager
+
+
+def get_settings_manager() -> "SettingsManager":
+ return service_manager.get(ServiceType.SETTINGS_MANAGER)
+
+
+def get_db_manager():
+ return service_manager.get(ServiceType.DATABASE_MANAGER)
+
+
+def get_session():
+ db_manager = service_manager.get(ServiceType.DATABASE_MANAGER)
+ yield from db_manager.get_session()
diff --git a/src/backend/langflow/settings.py b/src/backend/langflow/settings.py
deleted file mode 100644
index 9e6c60082..000000000
--- a/src/backend/langflow/settings.py
+++ /dev/null
@@ -1,99 +0,0 @@
-import os
-from typing import Optional
-
-import yaml
-from pydantic import BaseSettings, root_validator
-from langflow.utils.logger import logger
-
-
-class Settings(BaseSettings):
- chains: dict = {}
- agents: dict = {}
- prompts: dict = {}
- llms: dict = {}
- tools: dict = {}
- memories: dict = {}
- embeddings: dict = {}
- vectorstores: dict = {}
- documentloaders: dict = {}
- wrappers: dict = {}
- retrievers: dict = {}
- toolkits: dict = {}
- textsplitters: dict = {}
- utilities: dict = {}
- output_parsers: dict = {}
- dev: bool = False
- database_url: Optional[str] = None
- cache: str = "InMemoryCache"
- remove_api_keys: bool = False
-
- @root_validator(pre=True)
- def set_database_url(cls, values):
- if "database_url" not in values:
- logger.debug(
- "No database_url provided, trying LANGFLOW_DATABASE_URL env variable"
- )
- if langflow_database_url := os.getenv("LANGFLOW_DATABASE_URL"):
- values["database_url"] = langflow_database_url
- else:
- logger.debug("No DATABASE_URL env variable, using sqlite database")
- values["database_url"] = "sqlite:///./langflow.db"
- return values
-
- class Config:
- validate_assignment = True
- extra = "ignore"
-
- @root_validator(allow_reuse=True)
- def validate_lists(cls, values):
- for key, value in values.items():
- if key != "dev" and not value:
- values[key] = []
- return values
-
- def update_from_yaml(self, file_path: str, dev: bool = False):
- new_settings = load_settings_from_yaml(file_path)
- self.chains = new_settings.chains or {}
- self.agents = new_settings.agents or {}
- self.prompts = new_settings.prompts or {}
- self.llms = new_settings.llms or {}
- self.tools = new_settings.tools or {}
- self.memories = new_settings.memories or {}
- self.wrappers = new_settings.wrappers or {}
- self.toolkits = new_settings.toolkits or {}
- self.textsplitters = new_settings.textsplitters or {}
- self.utilities = new_settings.utilities or {}
- self.embeddings = new_settings.embeddings or {}
- self.vectorstores = new_settings.vectorstores or {}
- self.documentloaders = new_settings.documentloaders or {}
- self.retrievers = new_settings.retrievers or {}
- self.output_parsers = new_settings.output_parsers or {}
- self.dev = dev
-
- def update_settings(self, **kwargs):
- for key, value in kwargs.items():
- if hasattr(self, key):
- setattr(self, key, value)
-
-
-def save_settings_to_yaml(settings: Settings, file_path: str):
- with open(file_path, "w") as f:
- settings_dict = settings.dict()
- yaml.dump(settings_dict, f)
-
-
-def load_settings_from_yaml(file_path: str) -> Settings:
- # Check if a string is a valid path or a file name
- if "/" not in file_path:
- # Get current path
- current_path = os.path.dirname(os.path.abspath(__file__))
-
- file_path = os.path.join(current_path, file_path)
-
- with open(file_path, "r") as f:
- settings_dict = yaml.safe_load(f)
-
- return Settings(**settings_dict)
-
-
-settings = load_settings_from_yaml("config.yaml")
diff --git a/src/backend/langflow/template/field/base.py b/src/backend/langflow/template/field/base.py
index a747ad322..31c68d094 100644
--- a/src/backend/langflow/template/field/base.py
+++ b/src/backend/langflow/template/field/base.py
@@ -6,23 +6,58 @@ from pydantic import BaseModel
class TemplateFieldCreator(BaseModel, ABC):
field_type: str = "str"
+ """The type of field this is. Default is a string."""
+
required: bool = False
+ """Specifies if the field is required. Defaults to False."""
+
placeholder: str = ""
+ """A placeholder string for the field. Default is an empty string."""
+
is_list: bool = False
+ """Defines if the field is a list. Default is False."""
+
show: bool = True
+ """Should the field be shown. Defaults to True."""
+
multiline: bool = False
+ """Defines if the field will allow the user to open a text editor. Default is False."""
+
value: Any = None
+ """The value of the field. Default is None."""
+
suffixes: list[str] = []
- fileTypes: list[str] = []
+ """List of suffixes for a file field. Default is an empty list."""
+
     file_types: list[str] = []
+    """List of file types accepted by a file field. Default is an empty list."""
+
file_path: Union[str, None] = None
+ """The file path of the field if it is a file. Defaults to None."""
+
password: bool = False
+ """Specifies if the field is a password. Defaults to False."""
+
options: list[str] = []
+ """List of options for the field. Only used when is_list=True. Default is an empty list."""
+
name: str = ""
+ """Name of the field. Default is an empty string."""
+
display_name: Optional[str] = None
+ """Display name of the field. Defaults to None."""
+
     advanced: bool = False
+    """Specifies if the field will be an advanced (hidden) parameter. Defaults to False."""
+
input_types: list[str] = []
+ """List of input types for the handle when the field has more than one type. Default is an empty list."""
+
+ dynamic: bool = False
+ """Specifies if the field is dynamic. Defaults to False."""
+
info: Optional[str] = ""
+ """Additional information about the field to be shown in the tooltip. Defaults to an empty string."""
def to_dict(self):
result = self.dict()
diff --git a/src/backend/langflow/template/frontend_node/__init__.py b/src/backend/langflow/template/frontend_node/__init__.py
index c36234364..e13aa1ded 100644
--- a/src/backend/langflow/template/frontend_node/__init__.py
+++ b/src/backend/langflow/template/frontend_node/__init__.py
@@ -9,6 +9,7 @@ from langflow.template.frontend_node import (
vectorstores,
documentloaders,
textsplitters,
+ custom_components,
)
__all__ = [
@@ -22,4 +23,5 @@ __all__ = [
"vectorstores",
"documentloaders",
"textsplitters",
+ "custom_components",
]
diff --git a/src/backend/langflow/template/frontend_node/agents.py b/src/backend/langflow/template/frontend_node/agents.py
index 02aea78b9..63c8a4d5e 100644
--- a/src/backend/langflow/template/frontend_node/agents.py
+++ b/src/backend/langflow/template/frontend_node/agents.py
@@ -145,7 +145,7 @@ class CSVAgentNode(FrontendNode):
name="path",
value="",
suffixes=[".csv"],
- fileTypes=["csv"],
+ file_types=["csv"],
),
TemplateField(
field_type="BaseLanguageModel",
diff --git a/src/backend/langflow/template/frontend_node/base.py b/src/backend/langflow/template/frontend_node/base.py
index 7dae45463..fe19b5652 100644
--- a/src/backend/langflow/template/frontend_node/base.py
+++ b/src/backend/langflow/template/frontend_node/base.py
@@ -5,13 +5,14 @@ from typing import List, Optional
from pydantic import BaseModel, Field
from langflow.template.frontend_node.formatter import field_formatters
-from langflow.template.frontend_node.constants import FORCE_SHOW_FIELDS
+from langflow.template.frontend_node.constants import (
+ CLASSES_TO_REMOVE,
+ FORCE_SHOW_FIELDS,
+)
from langflow.template.field.base import TemplateField
from langflow.template.template.base import Template
from langflow.utils import constants
-CLASSES_TO_REMOVE = ["Serializable", "BaseModel", "object"]
-
class FieldFormatters(BaseModel):
formatters = {
@@ -51,14 +52,8 @@ class FrontendNode(BaseModel):
custom_fields: defaultdict = defaultdict(list)
output_types: List[str] = []
field_formatters: FieldFormatters = Field(default_factory=FieldFormatters)
-
- def process_base_classes(self) -> None:
- """Removes unwanted base classes from the list of base classes."""
- self.base_classes = [
- base_class
- for base_class in self.base_classes
- if base_class not in CLASSES_TO_REMOVE
- ]
+ beta: bool = False
+ error: Optional[str] = None
# field formatters is an instance attribute but it is not used in the class
# so we need to create a method to get it
@@ -70,6 +65,14 @@ class FrontendNode(BaseModel):
"""Sets the documentation of the frontend node."""
self.documentation = documentation
+ def process_base_classes(self) -> None:
+ """Removes unwanted base classes from the list of base classes."""
+ self.base_classes = [
+ base_class
+ for base_class in self.base_classes
+ if base_class not in CLASSES_TO_REMOVE
+ ]
+
def to_dict(self) -> dict:
"""Returns a dict representation of the frontend node."""
self.process_base_classes()
@@ -82,6 +85,8 @@ class FrontendNode(BaseModel):
"custom_fields": self.custom_fields,
"output_types": self.output_types,
"documentation": self.documentation,
+ "beta": self.beta,
+ "error": self.error,
},
}
diff --git a/src/backend/langflow/template/frontend_node/constants.py b/src/backend/langflow/template/frontend_node/constants.py
index 513ccd1ef..8800a3755 100644
--- a/src/backend/langflow/template/frontend_node/constants.py
+++ b/src/backend/langflow/template/frontend_node/constants.py
@@ -63,3 +63,6 @@ You can change this to use other APIs like JinaChat, LocalAI and Prem.
INPUT_KEY_INFO = """The variable to be used as Chat Input when more than one variable is available."""
OUTPUT_KEY_INFO = """The variable to be used as Chat Output (e.g. answer in a ConversationalRetrievalChain)"""
+
+
+CLASSES_TO_REMOVE = ["Serializable", "BaseModel", "object", "Runnable", "Generic"]
diff --git a/src/backend/langflow/template/frontend_node/custom_components.py b/src/backend/langflow/template/frontend_node/custom_components.py
new file mode 100644
index 000000000..4f36a1c9f
--- /dev/null
+++ b/src/backend/langflow/template/frontend_node/custom_components.py
@@ -0,0 +1,31 @@
+from langflow.template.field.base import TemplateField
+from langflow.template.frontend_node.base import FrontendNode
+from langflow.template.template.base import Template
+from langflow.interface.custom.constants import DEFAULT_CUSTOM_COMPONENT_CODE
+
+
+class CustomComponentFrontendNode(FrontendNode):
+ name: str = "CustomComponent"
+ display_name: str = "Custom Component"
+ beta: bool = True
+ template: Template = Template(
+ type_name="CustomComponent",
+ fields=[
+ TemplateField(
+ field_type="code",
+ required=True,
+ placeholder="",
+ is_list=False,
+ show=True,
+ value=DEFAULT_CUSTOM_COMPONENT_CODE,
+ name="code",
+ advanced=False,
+ dynamic=True,
+ )
+ ],
+ )
+ description: str = "Create any custom component you want!"
+ base_classes: list[str] = []
+
+ def to_dict(self):
+ return super().to_dict()
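The new `beta` and `error` fields flow straight into the serialized template dict, which is how `CustomComponentFrontendNode` ends up flagged as beta in the UI. A stripped-down stand-in (a plain dataclass rather than the real pydantic `FrontendNode`) shows the shape of the output:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class NodeSketch:
    # Illustrative stand-in for FrontendNode; only the fields added in
    # this diff (beta, error) are modeled here.
    name: str
    beta: bool = False
    error: Optional[str] = None

    def to_dict(self) -> dict:
        # Mirrors the {name: {...attributes...}} layout used by to_dict().
        return {
            self.name: {
                "beta": self.beta,
                "error": self.error,
            }
        }


print(NodeSketch("CustomComponent", beta=True).to_dict())
# -> {'CustomComponent': {'beta': True, 'error': None}}
```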
diff --git a/src/backend/langflow/template/frontend_node/documentloaders.py b/src/backend/langflow/template/frontend_node/documentloaders.py
index d775d8736..cdf67e54a 100644
--- a/src/backend/langflow/template/frontend_node/documentloaders.py
+++ b/src/backend/langflow/template/frontend_node/documentloaders.py
@@ -14,7 +14,7 @@ def build_file_field(
name=name,
value="",
suffixes=suffixes,
- fileTypes=fileTypes,
+ file_types=fileTypes,
)
@@ -30,7 +30,6 @@ class DocumentLoaderFrontNode(FrontendNode):
"UnstructuredEmailLoader": build_file_field(
suffixes=[".eml"], fileTypes=["eml"]
),
- "SlackDirectoryLoader": build_file_field(suffixes=[".zip"], fileTypes=["zip"]),
"EverNoteLoader": build_file_field(suffixes=[".xml"], fileTypes=["xml"]),
"FacebookChatLoader": build_file_field(suffixes=[".json"], fileTypes=["json"]),
"BSHTMLLoader": build_file_field(suffixes=[".html"], fileTypes=["html"]),
@@ -105,7 +104,30 @@ class DocumentLoaderFrontNode(FrontendNode):
advanced=False,
)
)
-
+ elif self.template.type_name in {"SlackDirectoryLoader"}:
+ self.template.add_field(
+ TemplateField(
+ field_type="file",
+ required=True,
+ show=True,
+ name="zip_path",
+ value="",
+ display_name="Path to zip file",
+ suffixes=[".zip"],
+ file_types=["zip"],
+ )
+ )
+ self.template.add_field(
+ TemplateField(
+ field_type="str",
+ required=False,
+ show=True,
+ name="workspace_url",
+ value="",
+ display_name="Workspace URL",
+ advanced=False,
+ )
+ )
elif self.template.type_name in self.file_path_templates:
self.template.add_field(self.file_path_templates[self.template.type_name])
elif self.template.type_name in {
diff --git a/src/backend/langflow/template/frontend_node/embeddings.py b/src/backend/langflow/template/frontend_node/embeddings.py
index d466b11a4..4e7e25112 100644
--- a/src/backend/langflow/template/frontend_node/embeddings.py
+++ b/src/backend/langflow/template/frontend_node/embeddings.py
@@ -5,6 +5,47 @@ from langflow.template.frontend_node.base import FrontendNode
class EmbeddingFrontendNode(FrontendNode):
+ def add_extra_fields(self) -> None:
+ if "VertexAI" in self.template.type_name:
+ # Add a credentials field, which should be of type file.
+ self.template.add_field(
+ TemplateField(
+ field_type="file",
+ required=False,
+ show=True,
+ name="credentials",
+ value="",
+ suffixes=[".json"],
+ file_types=["json"],
+ )
+ )
+
+ @staticmethod
+ def format_vertex_field(field: TemplateField, name: str):
+ if "VertexAI" in name:
+ advanced_fields = [
+ "verbose",
+ "top_p",
+ "top_k",
+ "max_output_tokens",
+ ]
+ if field.name in advanced_fields:
+ field.advanced = True
+ show_fields = [
+ "verbose",
+ "project",
+ "location",
+ "credentials",
+ "max_output_tokens",
+ "model_name",
+ "temperature",
+ "top_p",
+ "top_k",
+ ]
+
+ if field.name in show_fields:
+ field.show = True
+
@staticmethod
def format_jina_fields(field: TemplateField):
if "jina" in field.name:
@@ -41,10 +82,36 @@ class EmbeddingFrontendNode(FrontendNode):
@staticmethod
def format_field(field: TemplateField, name: Optional[str] = None) -> None:
FrontendNode.format_field(field, name)
+ if name and "vertex" in name.lower():
+ EmbeddingFrontendNode.format_vertex_field(field, name)
field.advanced = not field.required
field.show = True
if field.name == "headers":
field.show = False
+ if field.name == "model_kwargs":
+ field.field_type = "code"
+ field.advanced = True
+ field.show = True
+ elif field.name in [
+ "model_name",
+ "temperature",
+ "model_file",
+ "model_type",
+ "deployment_name",
+ "credentials",
+ ]:
+ field.advanced = False
+ field.show = True
+ if field.name == "credentials":
+ field.field_type = "file"
+ if name == "VertexAI" and field.name not in [
+ "callbacks",
+ "client",
+ "stop",
+ "tags",
+ "cache",
+ ]:
+ field.show = True
# Format Jina fields
EmbeddingFrontendNode.format_jina_fields(field)
diff --git a/src/backend/langflow/template/frontend_node/llms.py b/src/backend/langflow/template/frontend_node/llms.py
index de0fa3c0b..a6a128cfe 100644
--- a/src/backend/langflow/template/frontend_node/llms.py
+++ b/src/backend/langflow/template/frontend_node/llms.py
@@ -19,7 +19,7 @@ class LLMFrontendNode(FrontendNode):
name="credentials",
value="",
suffixes=[".json"],
- fileTypes=["json"],
+ file_types=["json"],
)
)
diff --git a/src/backend/langflow/template/frontend_node/memories.py b/src/backend/langflow/template/frontend_node/memories.py
index 374d36ff0..019dc0fa8 100644
--- a/src/backend/langflow/template/frontend_node/memories.py
+++ b/src/backend/langflow/template/frontend_node/memories.py
@@ -94,6 +94,14 @@ class MemoryFrontendNode(FrontendNode):
field.show = False
field.required = False
+ if name == "MotorheadMemory":
+ if field.name == "chat_memory":
+ field.show = False
+ field.required = False
+ elif field.name == "client_id":
+ field.show = True
+ field.advanced = False
+
class PostgresChatMessageHistoryFrontendNode(MemoryFrontendNode):
name: str = "PostgresChatMessageHistory"
diff --git a/src/backend/langflow/template/frontend_node/tools.py b/src/backend/langflow/template/frontend_node/tools.py
index ece765ed7..579b32da3 100644
--- a/src/backend/langflow/template/frontend_node/tools.py
+++ b/src/backend/langflow/template/frontend_node/tools.py
@@ -1,7 +1,9 @@
from langflow.template.field.base import TemplateField
from langflow.template.frontend_node.base import FrontendNode
from langflow.template.template.base import Template
-from langflow.utils.constants import DEFAULT_PYTHON_FUNCTION
+from langflow.utils.constants import (
+ DEFAULT_PYTHON_FUNCTION,
+)
class ToolNode(FrontendNode):
diff --git a/src/backend/langflow/template/frontend_node/utilities.py b/src/backend/langflow/template/frontend_node/utilities.py
index 615d7d12f..df993e377 100644
--- a/src/backend/langflow/template/frontend_node/utilities.py
+++ b/src/backend/langflow/template/frontend_node/utilities.py
@@ -12,8 +12,11 @@ class UtilitiesFrontendNode(FrontendNode):
FrontendNode.format_field(field, name)
# field.field_type could be "Literal['news', 'search', 'places', 'images']
# we need to convert it to a list
+ # It can also appear as "typing_extensions.Literal['news', 'search', 'places', 'images']"
if "Literal" in field.field_type:
- field.options = ast.literal_eval(field.field_type.replace("Literal", ""))
+ field_type = field.field_type.replace("typing_extensions.", "")
+ field_type = field_type.replace("Literal", "")
+ field.options = ast.literal_eval(field_type)
field.is_list = True
field.field_type = "str"
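The `typing_extensions.` prefix handling can be exercised in isolation. This sketch reproduces the parsing logic with plain `ast`, outside the `TemplateField` machinery (the function name is illustrative):

```python
import ast


def parse_literal_options(field_type: str) -> list:
    """Turn a rendered Literal annotation into a list of options.

    Handles both "Literal['a', 'b']" and the
    "typing_extensions.Literal['a', 'b']" rendering the comment mentions.
    """
    field_type = field_type.replace("typing_extensions.", "")
    field_type = field_type.replace("Literal", "")
    # What remains is "['a', 'b']", which literal_eval parses into a list.
    return ast.literal_eval(field_type)


print(parse_literal_options("typing_extensions.Literal['news', 'search']"))
# -> ['news', 'search']
```

Stripping the module prefix before removing `Literal` is what makes the prefixed form reduce to the same bracketed list as the bare one.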
diff --git a/src/backend/langflow/template/frontend_node/vectorstores.py b/src/backend/langflow/template/frontend_node/vectorstores.py
index 53a840b80..23c293437 100644
--- a/src/backend/langflow/template/frontend_node/vectorstores.py
+++ b/src/backend/langflow/template/frontend_node/vectorstores.py
@@ -4,6 +4,52 @@ from langflow.template.field.base import TemplateField
from langflow.template.frontend_node.base import FrontendNode
+BASIC_FIELDS = [
+ "work_dir",
+ "collection_name",
+ "api_key",
+ "location",
+ "persist_directory",
+ "persist",
+ "weaviate_url",
+ "index_name",
+ "namespace",
+ "folder_path",
+ "table_name",
+ "query_name",
+ "supabase_url",
+ "supabase_service_key",
+ "mongodb_atlas_cluster_uri",
+ "collection_name",
+ "db_name",
+]
+ADVANCED_FIELDS = [
+ "n_dim",
+ "key",
+ "prefix",
+ "distance_func",
+ "content_payload_key",
+ "metadata_payload_key",
+ "timeout",
+ "host",
+ "path",
+ "url",
+ "port",
+ "https",
+ "prefer_grpc",
+ "grpc_port",
+ "pinecone_api_key",
+ "pinecone_env",
+ "client_kwargs",
+ "search_kwargs",
+ "chroma_server_host",
+ "chroma_server_http_port",
+ "chroma_server_ssl_enabled",
+ "chroma_server_grpc_port",
+ "chroma_server_cors_allow_origins",
+]
+
+
class VectorStoreFrontendNode(FrontendNode):
def add_extra_fields(self) -> None:
extra_fields: List[TemplateField] = []
@@ -45,16 +91,62 @@ class VectorStoreFrontendNode(FrontendNode):
elif self.template.type_name == "Chroma":
# New bool field for persist parameter
- extra_field = TemplateField(
- name="persist",
- field_type="bool",
- required=False,
- show=True,
- advanced=False,
- value=False,
- display_name="Persist",
- )
- extra_fields.append(extra_field)
+ chroma_fields = [
+ TemplateField(
+ name="persist",
+ field_type="bool",
+ required=False,
+ show=True,
+ advanced=False,
+ value=False,
+ display_name="Persist",
+ ),
+ # chroma_server_grpc_port: str | None = None,
+ TemplateField(
+ name="chroma_server_host",
+ field_type="str",
+ required=False,
+ show=True,
+ advanced=True,
+ display_name="Chroma Server Host",
+ ),
+ TemplateField(
+ name="chroma_server_http_port",
+ field_type="str",
+ required=False,
+ show=True,
+ advanced=True,
+ display_name="Chroma Server HTTP Port",
+ ),
+ TemplateField(
+ name="chroma_server_ssl_enabled",
+ field_type="bool",
+ required=False,
+ show=True,
+ advanced=True,
+ value=False,
+ display_name="Chroma Server SSL Enabled",
+ ),
+ TemplateField(
+ name="chroma_server_grpc_port",
+ field_type="str",
+ required=False,
+ show=True,
+ advanced=True,
+ display_name="Chroma Server GRPC Port",
+ ),
+ TemplateField(
+ name="chroma_server_cors_allow_origins",
+ field_type="str",
+ required=False,
+ is_list=True,
+ show=True,
+ advanced=True,
+ display_name="Chroma Server CORS Allow Origins",
+ ),
+ ]
+
+ extra_fields.extend(chroma_fields)
elif self.template.type_name == "Pinecone":
# add pinecone_api_key and pinecone_env
extra_field = TemplateField(
@@ -208,45 +300,6 @@ class VectorStoreFrontendNode(FrontendNode):
def format_field(field: TemplateField, name: Optional[str] = None) -> None:
FrontendNode.format_field(field, name)
# Define common field attributes
- basic_fields = [
- "work_dir",
- "collection_name",
- "api_key",
- "location",
- "persist_directory",
- "persist",
- "weaviate_url",
- "index_name",
- "namespace",
- "folder_path",
- "table_name",
- "query_name",
- "supabase_url",
- "supabase_service_key",
- "mongodb_atlas_cluster_uri",
- "collection_name",
- "db_name",
- ]
- advanced_fields = [
- "n_dim",
- "key",
- "prefix",
- "distance_func",
- "content_payload_key",
- "metadata_payload_key",
- "timeout",
- "host",
- "path",
- "url",
- "port",
- "https",
- "prefer_grpc",
- "grpc_port",
- "pinecone_api_key",
- "pinecone_env",
- "client_kwargs",
- "search_kwargs",
- ]
# Check and set field attributes
if field.name == "texts":
@@ -269,7 +322,7 @@ class VectorStoreFrontendNode(FrontendNode):
field.display_name = "Embedding"
field.field_type = "Embeddings"
- elif field.name in basic_fields:
+ elif field.name in BASIC_FIELDS:
field.show = True
field.advanced = False
if field.name == "api_key":
@@ -279,7 +332,7 @@ class VectorStoreFrontendNode(FrontendNode):
field.value = ":memory:"
field.placeholder = ":memory:"
- elif field.name in advanced_fields:
+ elif field.name in ADVANCED_FIELDS:
field.show = True
field.advanced = True
if "key" in field.name:
diff --git a/src/backend/langflow/utils/constants.py b/src/backend/langflow/utils/constants.py
index 44103c2b7..e473d855b 100644
--- a/src/backend/langflow/utils/constants.py
+++ b/src/backend/langflow/utils/constants.py
@@ -17,18 +17,29 @@ CHAT_OPENAI_MODELS = [
]
ANTHROPIC_MODELS = [
- "claude-v1", # largest model, ideal for a wide range of more complex tasks.
- "claude-v1-100k", # An enhanced version of claude-v1 with a 100,000 token (roughly 75,000 word) context window.
- "claude-instant-v1", # A smaller model with far lower latency, sampling at roughly 40 words/sec!
- "claude-instant-v1-100k", # Like claude-instant-v1 with a 100,000 token context window but retains its performance.
+ # largest model, ideal for a wide range of more complex tasks.
+ "claude-v1",
+ # An enhanced version of claude-v1 with a 100,000 token (roughly 75,000 word) context window.
+ "claude-v1-100k",
+ # A smaller model with far lower latency, sampling at roughly 40 words/sec!
+ "claude-instant-v1",
+ # Like claude-instant-v1 with a 100,000 token context window but retains its performance.
+ "claude-instant-v1-100k",
# Specific sub-versions of the above models:
- "claude-v1.3", # Vs claude-v1.2: better instruction-following, code, and non-English dialogue and writing.
- "claude-v1.3-100k", # An enhanced version of claude-v1.3 with a 100,000 token (roughly 75,000 word) context window.
- "claude-v1.2", # Vs claude-v1.1: small adv in general helpfulness, instruction following, coding, and other tasks.
- "claude-v1.0", # An earlier version of claude-v1.
- "claude-instant-v1.1", # Latest version of claude-instant-v1. Better than claude-instant-v1.0 at most tasks.
- "claude-instant-v1.1-100k", # Version of claude-instant-v1.1 with a 100K token context window.
- "claude-instant-v1.0", # An earlier version of claude-instant-v1.
+ # Vs claude-v1.2: better instruction-following, code, and non-English dialogue and writing.
+ "claude-v1.3",
+ # An enhanced version of claude-v1.3 with a 100,000 token (roughly 75,000 word) context window.
+ "claude-v1.3-100k",
+ # Vs claude-v1.1: small adv in general helpfulness, instruction following, coding, and other tasks.
+ "claude-v1.2",
+ # An earlier version of claude-v1.
+ "claude-v1.0",
+ # Latest version of claude-instant-v1. Better than claude-instant-v1.0 at most tasks.
+ "claude-instant-v1.1",
+ # Version of claude-instant-v1.1 with a 100K token context window.
+ "claude-instant-v1.1-100k",
+ # An earlier version of claude-instant-v1.
+ "claude-instant-v1.0",
]
DEFAULT_PYTHON_FUNCTION = """
@@ -36,4 +47,5 @@ def python_function(text: str) -> str:
\"\"\"This is a default python function that returns the input text\"\"\"
return text
"""
+
DIRECT_TYPES = ["str", "bool", "code", "int", "float", "Any", "prompt"]
diff --git a/src/backend/langflow/utils/lazy_load.py b/src/backend/langflow/utils/lazy_load.py
new file mode 100644
index 000000000..df0130acc
--- /dev/null
+++ b/src/backend/langflow/utils/lazy_load.py
@@ -0,0 +1,15 @@
+class LazyLoadDictBase:
+ def __init__(self):
+ self._all_types_dict = None
+
+ @property
+ def all_types_dict(self):
+ if self._all_types_dict is None:
+ self._all_types_dict = self._build_dict()
+ return self._all_types_dict
+
+ def _build_dict(self):
+ raise NotImplementedError
+
+ def get_type_dict(self):
+ raise NotImplementedError
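A minimal subclass shows the intended usage pattern of `LazyLoadDictBase`: the dictionary is built once, on first access, then cached. The base class is re-declared here so the sketch is self-contained; the subclass name and dict contents are illustrative, not from the diff:

```python
class LazyLoadDictBase:
    def __init__(self):
        self._all_types_dict = None

    @property
    def all_types_dict(self):
        # Build the dict only on first access, then serve the cached copy.
        if self._all_types_dict is None:
            self._all_types_dict = self._build_dict()
        return self._all_types_dict

    def _build_dict(self):
        raise NotImplementedError


class ToyTypeRegistry(LazyLoadDictBase):
    build_count = 0  # tracks how many times the expensive build ran

    def _build_dict(self):
        type(self).build_count += 1
        return {"llms": ["OpenAI"], "tools": ["Search"]}


registry = ToyTypeRegistry()
registry.all_types_dict  # first access triggers the build
registry.all_types_dict  # second access hits the cache
print(ToyTypeRegistry.build_count)  # -> 1
```

Deferring the build this way avoids paying the import-and-introspect cost of the type registry at module import time.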
diff --git a/src/backend/langflow/utils/logger.py b/src/backend/langflow/utils/logger.py
index b70a451d4..deb0f75ca 100644
--- a/src/backend/langflow/utils/logger.py
+++ b/src/backend/langflow/utils/logger.py
@@ -6,7 +6,7 @@ from rich.logging import RichHandler
logger = logging.getLogger("langflow")
-def configure(log_level: str = "INFO", log_file: Path = None): # type: ignore
+def configure(log_level: str = "DEBUG", log_file: Path = None): # type: ignore
log_format = "%(asctime)s - %(levelname)s - %(message)s"
log_level_value = getattr(logging, log_level.upper(), logging.INFO)
diff --git a/src/backend/langflow/utils/types.py b/src/backend/langflow/utils/types.py
new file mode 100644
index 000000000..3657d550e
--- /dev/null
+++ b/src/backend/langflow/utils/types.py
@@ -0,0 +1,2 @@
+class Prompt:
+ pass
diff --git a/src/backend/langflow/utils/util.py b/src/backend/langflow/utils/util.py
index c5db6052e..f68c9dbe2 100644
--- a/src/backend/langflow/utils/util.py
+++ b/src/backend/langflow/utils/util.py
@@ -1,13 +1,15 @@
-import importlib
-import inspect
import re
+import inspect
+import importlib
from functools import wraps
-from typing import Dict, Optional
+from typing import Optional, Dict, Any, Union
from docstring_parser import parse # type: ignore
from langflow.template.frontend_node.constants import FORCE_SHOW_FIELDS
from langflow.utils import constants
+from langflow.utils.logger import logger
+from multiprocess import cpu_count # type: ignore
def build_template_from_function(
@@ -214,111 +216,6 @@ def get_default_factory(module: str, function: str):
return None
-def format_dict(d, name: Optional[str] = None):
- """
- Formats a dictionary by removing certain keys and modifying the
- values of other keys.
-
- Args:
- d: the dictionary to format
- name: the name of the class to format
-
- Returns:
- A new dictionary with the desired modifications applied.
- """
-
- # Process remaining keys
- for key, value in d.items():
- if key == "_type":
- continue
-
- _type = value["type"]
-
- if not isinstance(_type, str):
- _type = _type.__name__
-
- # Remove 'Optional' wrapper
- if "Optional" in _type:
- _type = _type.replace("Optional[", "")[:-1]
-
- # Check for list type
- if "List" in _type or "Sequence" in _type or "Set" in _type:
- _type = (
- _type.replace("List[", "")
- .replace("Sequence[", "")
- .replace("Set[", "")[:-1]
- )
- value["list"] = True
- else:
- value["list"] = False
-
- # Replace 'Mapping' with 'dict'
- if "Mapping" in _type:
- _type = _type.replace("Mapping", "dict")
-
- # Change type from str to Tool
- value["type"] = "Tool" if key in ["allowed_tools"] else _type
-
- value["type"] = "int" if key in ["max_value_length"] else value["type"]
-
- # Show or not field
- value["show"] = bool(
- (value["required"] and key not in ["input_variables"])
- or key in FORCE_SHOW_FIELDS
- or "api_key" in key
- )
-
- # Add password field
- value["password"] = any(
- text in key.lower() for text in ["password", "token", "api", "key"]
- )
-
- # Add multline
- value["multiline"] = key in [
- "suffix",
- "prefix",
- "template",
- "examples",
- "code",
- "headers",
- "format_instructions",
- ]
-
- # Replace dict type with str
- if "dict" in value["type"].lower():
- value["type"] = "code"
-
- if key == "dict_":
- value["type"] = "file"
- value["suffixes"] = [".json", ".yaml", ".yml"]
- value["fileTypes"] = ["json", "yaml", "yml"]
-
- # Replace default value with actual value
- if "default" in value:
- value["value"] = value["default"]
- value.pop("default")
-
- if key == "headers":
- value[
- "value"
- ] = """{'Authorization':
- 'Bearer '}"""
- # Add options to openai
- if name == "OpenAI" and key == "model_name":
- value["options"] = constants.OPENAI_MODELS
- value["list"] = True
- value["value"] = constants.OPENAI_MODELS[0]
- elif name == "ChatOpenAI" and key == "model_name":
- value["options"] = constants.CHAT_OPENAI_MODELS
- value["list"] = True
- value["value"] = constants.CHAT_OPENAI_MODELS[0]
- elif (name == "Anthropic" or name == "ChatAnthropic") and key == "model_name":
- value["options"] = constants.ANTHROPIC_MODELS
- value["list"] = True
- value["value"] = constants.ANTHROPIC_MODELS[0]
- return d
-
-
def update_verbose(d: dict, new_value: bool) -> dict:
"""
Recursively updates the value of the 'verbose' key in a dictionary.
@@ -349,3 +246,219 @@ def sync_to_async(func):
return func(*args, **kwargs)
return async_wrapper
+
+
+def format_dict(
+ dictionary: Dict[str, Any], class_name: Optional[str] = None
+) -> Dict[str, Any]:
+ """
+ Formats a dictionary by removing certain keys and modifying the
+ values of other keys.
+
+ Returns:
+ A new dictionary with the desired modifications applied.
+ """
+
+ for key, value in dictionary.items():
+ if key == "_type":
+ continue
+
+ _type: Union[str, type] = get_type(value)
+
+ _type = remove_optional_wrapper(_type)
+ _type = check_list_type(_type, value)
+ _type = replace_mapping_with_dict(_type)
+
+ value["type"] = get_formatted_type(key, _type)
+ value["show"] = should_show_field(value, key)
+ value["password"] = is_password_field(key)
+ value["multiline"] = is_multiline_field(key)
+
+ replace_dict_type_with_code(value)
+
+ if key == "dict_":
+ set_dict_file_attributes(value)
+
+ replace_default_value_with_actual(value)
+
+ if key == "headers":
+ set_headers_value(value)
+
+ add_options_to_field(value, class_name, key)
+
+ return dictionary
+
+
+def get_type(value: Any) -> Union[str, type]:
+ """
+ Retrieves the type value from the dictionary.
+
+ Returns:
+ The type value.
+ """
+ _type = value["type"]
+
+ return _type if isinstance(_type, str) else _type.__name__
+
+
+def remove_optional_wrapper(_type: Union[str, type]) -> str:
+ """
+ Removes the 'Optional' wrapper from the type string.
+
+ Returns:
+ The type string with the 'Optional' wrapper removed.
+ """
+ if isinstance(_type, type):
+ _type = str(_type)
+ if "Optional" in _type:
+ _type = _type.replace("Optional[", "")[:-1]
+
+ return _type
+
+
+def check_list_type(_type: str, value: Dict[str, Any]) -> str:
+ """
+ Checks if the type is a list type and modifies the value accordingly.
+
+ Returns:
+ The modified type string.
+ """
+ if any(list_type in _type for list_type in ["List", "Sequence", "Set"]):
+ _type = (
+ _type.replace("List[", "").replace("Sequence[", "").replace("Set[", "")[:-1]
+ )
+ value["list"] = True
+ else:
+ value["list"] = False
+
+ return _type
+
+
+def replace_mapping_with_dict(_type: str) -> str:
+ """
+ Replaces 'Mapping' with 'dict' in the type string.
+
+ Returns:
+ The modified type string.
+ """
+ if "Mapping" in _type:
+ _type = _type.replace("Mapping", "dict")
+
+ return _type
+
+
+def get_formatted_type(key: str, _type: str) -> str:
+ """
+ Formats the type value based on the given key.
+
+ Returns:
+ The formatted type value.
+ """
+ if key == "allowed_tools":
+ return "Tool"
+
+ elif key == "max_value_length":
+ return "int"
+
+ return _type
+
+
+def should_show_field(value: Dict[str, Any], key: str) -> bool:
+ """
+ Determines if the field should be shown or not.
+
+ Returns:
+ True if the field should be shown, False otherwise.
+ """
+ return (
+ (value["required"] and key != "input_variables")
+ or key in FORCE_SHOW_FIELDS
+ or any(text in key.lower() for text in ["password", "token", "api", "key"])
+ )
+
+
+def is_password_field(key: str) -> bool:
+ """
+ Determines if the field is a password field.
+
+ Returns:
+ True if the field is a password field, False otherwise.
+ """
+ return any(text in key.lower() for text in ["password", "token", "api", "key"])
+
+
+def is_multiline_field(key: str) -> bool:
+ """
+ Determines if the field is a multiline field.
+
+ Returns:
+ True if the field is a multiline field, False otherwise.
+ """
+ return key in {
+ "suffix",
+ "prefix",
+ "template",
+ "examples",
+ "code",
+ "headers",
+ "format_instructions",
+ }
+
+
+def replace_dict_type_with_code(value: Dict[str, Any]) -> None:
+ """
+ Replaces the type value with 'code' if the type is a dict.
+ """
+ if "dict" in value["type"].lower():
+ value["type"] = "code"
+
+
+def set_dict_file_attributes(value: Dict[str, Any]) -> None:
+ """
+ Sets the file attributes for the 'dict_' key.
+ """
+ value["type"] = "file"
+ value["suffixes"] = [".json", ".yaml", ".yml"]
+ value["fileTypes"] = ["json", "yaml", "yml"]
+
+
+def replace_default_value_with_actual(value: Dict[str, Any]) -> None:
+ """
+ Replaces the default value with the actual value.
+ """
+ if "default" in value:
+ value["value"] = value["default"]
+ value.pop("default")
+
+
+def set_headers_value(value: Dict[str, Any]) -> None:
+ """
+ Sets the value for the 'headers' key.
+ """
+ value["value"] = """{'Authorization': 'Bearer '}"""
+
+
+def add_options_to_field(
+ value: Dict[str, Any], class_name: Optional[str], key: str
+) -> None:
+ """
+ Adds options to the field based on the class name and key.
+ """
+ options_map = {
+ "OpenAI": constants.OPENAI_MODELS,
+ "ChatOpenAI": constants.CHAT_OPENAI_MODELS,
+ "Anthropic": constants.ANTHROPIC_MODELS,
+ "ChatAnthropic": constants.ANTHROPIC_MODELS,
+ }
+
+ if class_name in options_map and key == "model_name":
+ value["options"] = options_map[class_name]
+ value["list"] = True
+ value["value"] = options_map[class_name][0]
+
+
+def get_number_of_workers(workers=None):
+ if workers == -1 or workers is None:
+ workers = (cpu_count() * 2) + 1
+ logger.debug(f"Number of workers: {workers}")
+ return workers
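The new `get_number_of_workers` helper can be sketched without langflow's logger. This version uses the stdlib `multiprocessing.cpu_count` rather than the `multiprocess` package the diff imports; the `(2 * cores) + 1` formula is the common gunicorn sizing recommendation:

```python
from multiprocessing import cpu_count


def get_number_of_workers(workers=None):
    """Sketch of the worker heuristic added in the diff.

    An explicit worker count is passed through unchanged; None or -1
    falls back to (2 * cpu_count) + 1.
    """
    if workers == -1 or workers is None:
        workers = (cpu_count() * 2) + 1
    return workers


print(get_number_of_workers(4))  # explicit value passes through -> 4
print(get_number_of_workers(-1) == cpu_count() * 2 + 1)  # -> True
```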
diff --git a/src/backend/langflow/utils/validate.py b/src/backend/langflow/utils/validate.py
index 905b9dd44..f8a9c1d1d 100644
--- a/src/backend/langflow/utils/validate.py
+++ b/src/backend/langflow/utils/validate.py
@@ -163,9 +163,77 @@ def create_function(code, function_name):
return wrapped_function
+def create_class(code, class_name):
+ if not hasattr(ast, "TypeIgnore"):
+
+ class TypeIgnore(ast.AST):
+ _fields = ()
+
+ ast.TypeIgnore = TypeIgnore
+
+ module = ast.parse(code)
+ exec_globals = globals().copy()
+
+ for node in module.body:
+ if isinstance(node, ast.Import):
+ for alias in node.names:
+ try:
+ exec_globals[alias.asname or alias.name] = importlib.import_module(
+ alias.name
+ )
+ except ModuleNotFoundError as e:
+ raise ModuleNotFoundError(
+ f"Module {alias.name} not found. Please install it and try again."
+ ) from e
+ elif isinstance(node, ast.ImportFrom):
+ try:
+ imported_module = importlib.import_module(node.module)
+ for alias in node.names:
+ exec_globals[alias.name] = getattr(imported_module, alias.name)
+ except ModuleNotFoundError as e:
+ raise ModuleNotFoundError(
+ f"Module {node.module} not found. Please install it and try again."
+ ) from e
+
+ class_code = next(
+ node
+ for node in module.body
+ if isinstance(node, ast.ClassDef) and node.name == class_name
+ )
+ class_code.parent = None
+ code_obj = compile(
+ ast.Module(body=[class_code], type_ignores=[]), "", "exec"
+ )
+ # This suppresses import errors
+ # with contextlib.suppress(Exception):
+ exec(code_obj, exec_globals, locals())
+ exec_globals[class_name] = locals()[class_name]
+
+ # Return a function that imports necessary modules and creates an instance of the target class
+ def build_my_class(*args, **kwargs):
+ for module_name, module in exec_globals.items():
+ if isinstance(module, type(importlib)):
+ globals()[module_name] = module
+
+ instance = exec_globals[class_name](*args, **kwargs)
+ return instance
+
+ build_my_class.__globals__.update(exec_globals)
+
+ return build_my_class
+
+
def extract_function_name(code):
module = ast.parse(code)
for node in module.body:
if isinstance(node, ast.FunctionDef):
return node.name
raise ValueError("No function definition found in the code string")
+
+
+def extract_class_name(code):
+ module = ast.parse(code)
+ for node in module.body:
+ if isinstance(node, ast.ClassDef):
+ return node.name
+ raise ValueError("No class definition found in the code string")
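`extract_class_name` mirrors the existing `extract_function_name` and is simple enough to run standalone. A self-contained version with an example input (the source string is illustrative):

```python
import ast


def extract_class_name(code: str) -> str:
    """Return the name of the first class defined in a code string."""
    module = ast.parse(code)
    for node in module.body:
        if isinstance(node, ast.ClassDef):
            return node.name
    raise ValueError("No class definition found in the code string")


source = '''
import math

class MyComponent:
    pass
'''
print(extract_class_name(source))  # -> MyComponent
```

Walking only `module.body` means nested classes (defined inside functions or other classes) are deliberately ignored; the custom-component editor expects a top-level class.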
diff --git a/src/frontend/.dockerignore b/src/frontend/.dockerignore
new file mode 100644
index 000000000..ca5762007
--- /dev/null
+++ b/src/frontend/.dockerignore
@@ -0,0 +1,2 @@
+**/node_modules
+**/build
\ No newline at end of file
diff --git a/src/frontend/src/CustomNodes/GenericNode/components/parameterComponent/index.tsx b/src/frontend/src/CustomNodes/GenericNode/components/parameterComponent/index.tsx
index cfa9b8b92..f25dd5185 100644
--- a/src/frontend/src/CustomNodes/GenericNode/components/parameterComponent/index.tsx
+++ b/src/frontend/src/CustomNodes/GenericNode/components/parameterComponent/index.tsx
@@ -1,5 +1,11 @@
import { cloneDeep } from "lodash";
-import React, { useContext, useEffect, useRef, useState } from "react";
+import React, {
+ ReactNode,
+ useContext,
+ useEffect,
+ useRef,
+ useState,
+} from "react";
import { Handle, Position, useUpdateNodeInternals } from "reactflow";
import ShadTooltip from "../../../../components/ShadTooltipComponent";
import CodeAreaComponent from "../../../../components/codeAreaComponent";
@@ -13,21 +19,18 @@ import IntComponent from "../../../../components/intComponent";
import PromptAreaComponent from "../../../../components/promptComponent";
import TextAreaComponent from "../../../../components/textAreaComponent";
import ToggleShadComponent from "../../../../components/toggleShadComponent";
-import { MAX_LENGTH_TO_SCROLL_TOOLTIP } from "../../../../constants/constants";
+import { TOOLTIP_EMPTY } from "../../../../constants/constants";
import { TabsContext } from "../../../../contexts/tabsContext";
import { typesContext } from "../../../../contexts/typesContext";
import { ParameterComponentType } from "../../../../types/components";
+import { TabsState } from "../../../../types/tabs";
import { isValidConnection } from "../../../../utils/reactflowUtils";
import {
nodeColors,
nodeIconsLucide,
nodeNames,
} from "../../../../utils/styleUtils";
-import {
- classNames,
- getRandomKeyByssmm,
- groupByFamily,
-} from "../../../../utils/utils";
+import { classNames, groupByFamily } from "../../../../utils/utils";
export default function ParameterComponent({
left,
@@ -42,14 +45,15 @@ export default function ParameterComponent({
required = false,
optionalHandle = null,
info = "",
-}: ParameterComponentType) {
- const ref = useRef(null);
- const refHtml = useRef(null);
- const refNumberComponents = useRef(0);
- const infoHtml = useRef(null);
+}: ParameterComponentType): JSX.Element {
+ const ref = useRef(null);
+ const refHtml = useRef(null);
+ const infoHtml = useRef(null);
const updateNodeInternals = useUpdateNodeInternals();
const [position, setPosition] = useState(0);
- const { setTabsState, tabId, save } = useContext(TabsContext);
+ const { setTabsState, tabId, save, flows } = useContext(TabsContext);
+
+ const flow = flows.find((flow) => flow.id === tabId)?.data?.nodes ?? null;
// Update component position
useEffect(() => {
@@ -65,32 +69,37 @@ export default function ParameterComponent({
const { reactFlowInstance } = useContext(typesContext);
let disabled =
- reactFlowInstance?.getEdges().some((e) => e.targetHandle === id) ?? false;
+ reactFlowInstance?.getEdges().some((edge) => edge.targetHandle === id) ??
+ false;
const { data: myData } = useContext(typesContext);
- const handleOnNewValue = (newValue: any) => {
+ const handleOnNewValue = (newValue: string | string[] | boolean): void => {
let newData = cloneDeep(data);
- newData.node.template[name].value = newValue;
+ newData.node!.template[name].value = newValue;
setData(newData);
// Set state to pending
- setTabsState((prev) => {
+ //@ts-ignore
+ setTabsState((prev: TabsState) => {
return {
...prev,
[tabId]: {
...prev[tabId],
isPending: true,
+ formKeysData: prev[tabId].formKeysData,
},
};
});
+ renderTooltips();
};
useEffect(() => {
if (name === "openai_api_base") console.log(info);
+ // @ts-ignore
infoHtml.current = (
- {info.split("\n").map((line, i) => (
-
+ {info.split("\n").map((line, index) => (
+
{line}
))}
@@ -98,57 +107,67 @@ export default function ParameterComponent({
);
}, [info]);
- useEffect(() => {
- const groupedObj = groupByFamily(myData, tooltipTitle, left, data.type);
+ function renderTooltips() {
+ let groupedObj = groupByFamily(myData, tooltipTitle!, left, flow!);
- refNumberComponents.current = groupedObj[0]?.type?.length;
+ if (groupedObj && groupedObj.length > 0) {
+ //@ts-ignore
+ refHtml.current = groupedObj.map((item, index) => {
+ const Icon: any =
+ nodeIconsLucide[item.family] ?? nodeIconsLucide["unknown"];
- refHtml.current = groupedObj.map((item, i) => {
- const Icon: any = nodeIconsLucide[item.family];
-
- return (
-
0 ? "mt-2 flex items-center" : "flex items-center"
- )}
- >
- 0 ? "mt-2 flex items-center" : "flex items-center"
+ )}
>
-
-
-
- {nodeNames[item.family] ?? ""}{" "}
-
- {" "}
- {item.type === "" ? "" : " - "}
- {item.type.split(", ").length > 2
- ? item.type.split(", ").map((el, i) => (
-
-
- {i === item.type.split(", ").length - 1
- ? el
- : (el += `, `)}
-
-
- ))
- : item.type}
+ >
+
+
+
+ {nodeNames[item.family] ?? "Other"}
+
+ {" "}
+ {item.type === "" ? "" : " - "}
+ {item.type.split(", ").length > 2
+ ? item.type.split(", ").map((el, index) => (
+
+
+ {index === item.type.split(", ").length - 1
+ ? el
+ : (el += `, `)}
+
+
+ ))
+ : item.type}
+
-
- );
- });
- }, [tooltipTitle]);
+ );
+ });
+ } else {
+ //@ts-ignore
+ refHtml.current = <span>{TOOLTIP_EMPTY}</span>;
+ }
+ }
+
+ useEffect(() => {
+ renderTooltips();
+ }, [tooltipTitle, flow]);
+
return (
>
) : (
MAX_LENGTH_TO_SCROLL_TOOLTIP
- ? "tooltip-fixed-width custom-scroll overflow-y-scroll nowheel"
- : "tooltip-fixed-width"
- }
+ styleClasses={"tooltip-fixed-width custom-scroll nowheel"}
delayDuration={0}
content={refHtml.current}
side={left ? "left" : "right"}
@@ -204,7 +219,7 @@ export default function ParameterComponent({
position={left ? Position.Left : Position.Right}
id={id}
isValidConnection={(connection) =>
- isValidConnection(connection, reactFlowInstance)
+ isValidConnection(connection, reactFlowInstance!)
}
className={classNames(
left ? "-ml-0.5 " : "-mr-0.5 ",
@@ -220,9 +235,9 @@ export default function ParameterComponent({
{left === true &&
type === "str" &&
- !data.node.template[name].options ? (
+ !data.node?.template[name].options ? (
- {data.node.template[name].list ? (
+ {data.node?.template[name].list ? (
- ) : data.node.template[name].multiline ? (
+ ) : data.node?.template[name].multiline ? (
)}
@@ -252,9 +267,9 @@ export default function ParameterComponent({
{
- handleOnNewValue(t);
+ enabled={data.node?.template[name].value ?? false}
+ setEnabled={(isEnabled) => {
+ handleOnNewValue(isEnabled);
}}
size="large"
/>
@@ -263,13 +278,13 @@ export default function ParameterComponent({
) : left === true &&
type === "str" &&
- data.node.template[name].options ? (
+ data.node?.template[name].options ? (
{
data.node = nodeClass;
}}
nodeClass={data.node}
disabled={disabled}
- value={data.node.template[name].value ?? ""}
+ value={data.node?.template[name].value ?? ""}
onChange={handleOnNewValue}
/>
@@ -293,12 +309,12 @@ export default function ParameterComponent({
{
- data.node.template[name].file_path = t;
+ fileTypes={data.node?.template[name].fileTypes}
+ suffixes={data.node?.template[name].suffixes}
+ onFileChange={(filePath: string) => {
+ data.node!.template[name].file_path = filePath;
save();
}}
>
@@ -307,7 +323,7 @@ export default function ParameterComponent({
@@ -320,7 +336,7 @@ export default function ParameterComponent({
}}
nodeClass={data.node}
disabled={disabled}
- value={data.node.template[name].value ?? ""}
+ value={data.node?.template[name].value ?? ""}
onChange={handleOnNewValue}
/>
diff --git a/src/frontend/src/CustomNodes/GenericNode/index.tsx b/src/frontend/src/CustomNodes/GenericNode/index.tsx
index 1aff91f0b..690ace7af 100644
--- a/src/frontend/src/CustomNodes/GenericNode/index.tsx
+++ b/src/frontend/src/CustomNodes/GenericNode/index.tsx
@@ -1,11 +1,10 @@
import { cloneDeep } from "lodash";
-import { useContext, useEffect, useRef, useState } from "react";
+import { useContext, useEffect, useState } from "react";
import { NodeToolbar, useUpdateNodeInternals } from "reactflow";
import ShadTooltip from "../../components/ShadTooltipComponent";
import Tooltip from "../../components/TooltipComponent";
import IconComponent from "../../components/genericIconComponent";
import { useSSE } from "../../contexts/SSEContext";
-import { alertContext } from "../../contexts/alertContext";
import { TabsContext } from "../../contexts/tabsContext";
import { typesContext } from "../../contexts/typesContext";
import NodeToolbarComponent from "../../pages/FlowPage/components/nodeToolbarComponent";
@@ -23,14 +22,9 @@ export default function GenericNode({
selected: boolean;
}) {
const [data, setData] = useState(olddata);
- const { setErrorData } = useContext(alertContext);
const { updateFlow, flows, tabId } = useContext(TabsContext);
const updateNodeInternals = useUpdateNodeInternals();
- const showError = useRef(true);
const { types, deleteNode, reactFlowInstance } = useContext(typesContext);
- // any to avoid type conflict
- const Icon: any =
- nodeIconsLucide[data.type] || nodeIconsLucide[types[data.type]];
const name = nodeIconsLucide[data.type] ? data.type : types[data.type];
const [validationStatus, setValidationStatus] = useState(null);
// State for outline color
@@ -67,18 +61,6 @@ export default function GenericNode({
}
}, [sseData, data.id]);
- if (!Icon) {
- if (showError.current) {
- setErrorData({
- title: data.type
- ? `The ${data.type} node could not be rendered, please review your json file`
- : "There was a node that can't be rendered, please review your json file",
- });
- showError.current = false;
- }
- deleteNode(data.id);
- return;
- }
return (
<>
@@ -95,6 +77,11 @@ export default function GenericNode({
"generic-node-div"
)}
>
+ {data.node.beta && (
+
+ )}
) : (
- {validationStatus.params
+ {typeof validationStatus.params === "string"
? validationStatus.params
.split("\n")
.map((line, index) =>
{line}
)
@@ -172,44 +159,59 @@ export default function GenericNode({
<>
{Object.keys(data.node.template)
- .filter((t) => t.charAt(0) !== "_")
- .map((t: string, idx) => (
+ .filter((templateField) => templateField.charAt(0) !== "_")
+ .map((templateField: string, idx) => (
- {data.node.template[t].show &&
- !data.node.template[t].advanced ? (
+ {data.node.template[templateField].show &&
+ !data.node.template[templateField].advanced ? (
) : (
<>>
@@ -225,6 +227,7 @@ export default function GenericNode({
{" "}
-
+
{dropItem.title}
{dropItem.list ? (
{dropItem.list.map((item, idx) => (
-
+
{item}
))}
diff --git a/src/frontend/src/alerts/alertDropDown/index.tsx b/src/frontend/src/alerts/alertDropDown/index.tsx
index 90838d693..59f42d3ec 100644
--- a/src/frontend/src/alerts/alertDropDown/index.tsx
+++ b/src/frontend/src/alerts/alertDropDown/index.tsx
@@ -22,9 +22,9 @@ export default function AlertDropdown({ children }: AlertDropdownType) {
return (
{
- setOpen(k);
- if (k) setNotificationCenter(false);
+ onOpenChange={(target) => {
+ setOpen(target);
+ if (target) setNotificationCenter(false);
}}
>
{children}
diff --git a/src/frontend/src/alerts/notice/index.tsx b/src/frontend/src/alerts/notice/index.tsx
index 494d9d4dc..1acd5c898 100644
--- a/src/frontend/src/alerts/notice/index.tsx
+++ b/src/frontend/src/alerts/notice/index.tsx
@@ -47,7 +47,9 @@ export default function NoticeAlert({
/>
-
{title}
+
+ {title}
+
{link !== "" ? (
;
tabId: string;
+ invalidName: boolean;
+ setInvalidName: (invalidName: boolean) => void;
setName: (name: string) => void;
setDescription: (description: string) => void;
updateFlow: (flow: { id: string; name: string }) => void;
@@ -16,6 +19,8 @@ type InputProps = {
export const EditFlowSettings: React.FC = ({
name,
+ invalidName,
+ setInvalidName,
description,
maxLength = 50,
flows,
@@ -25,6 +30,14 @@ export const EditFlowSettings: React.FC = ({
updateFlow,
}) => {
const [isMaxLength, setIsMaxLength] = useState(false);
+ const nameLists = useRef<string[]>([]);
+ useEffect(() => {
+ readFlowsFromDatabase().then((flows) => {
+ flows.forEach((flow) => {
+ nameLists.current.push(flow.name);
+ });
+ });
+ }, []);
const handleNameChange = (event: ChangeEvent<HTMLInputElement>) => {
const { value } = event.target;
@@ -33,17 +46,21 @@ export const EditFlowSettings: React.FC = ({
} else {
setIsMaxLength(false);
}
-
+ if (!nameLists.current.includes(value)) {
+ setInvalidName(false);
+ } else {
+ setInvalidName(true);
+ }
setName(value);
};
const [desc, setDesc] = useState(
- flows.find((f) => f.id === tabId).description
+ flows.find((flow) => flow.id === tabId).description
);
const handleDescriptionChange = (event: ChangeEvent) => {
- flows.find((f) => f.id === tabId).description = event.target.value;
- setDesc(flows.find((f) => f.id === tabId).description);
+ flows.find((flow) => flow.id === tabId).description = event.target.value;
+ setDesc(flows.find((flow) => flow.id === tabId).description);
setDescription(event.target.value);
};
@@ -55,6 +72,9 @@ export const EditFlowSettings: React.FC = ({
{isMaxLength && (
Character limit reached
)}
+ {invalidName && (
+ Name already in use
+ )}
{
return {
...old,
diff --git a/src/frontend/src/components/chatComponent/index.tsx b/src/frontend/src/components/chatComponent/index.tsx
index 58c058866..2b32dc0c4 100644
--- a/src/frontend/src/components/chatComponent/index.tsx
+++ b/src/frontend/src/components/chatComponent/index.tsx
@@ -12,9 +12,8 @@ import { NodeType } from "../../types/flow";
export default function Chat({ flow }: ChatType) {
const [open, setOpen] = useState(false);
- const [isBuilt, setIsBuilt] = useState(false);
const [canOpen, setCanOpen] = useState(false);
- const { tabsState } = useContext(TabsContext);
+ const { tabsState, isBuilt, setIsBuilt } = useContext(TabsContext);
useEffect(() => {
const handleKeyDown = (event: KeyboardEvent) => {
@@ -63,8 +62,7 @@ export default function Chat({ flow }: ChatType) {
tabsState &&
tabsState[flow.id] &&
tabsState[flow.id].formKeysData &&
- tabsState[flow.id].formKeysData.input_keys &&
- Object.keys(tabsState[flow.id].formKeysData.input_keys).length > 0
+ tabsState[flow.id].formKeysData.input_keys !== null
) {
setCanOpen(true);
} else {
diff --git a/src/frontend/src/components/codeAreaComponent/index.tsx b/src/frontend/src/components/codeAreaComponent/index.tsx
index e95b03bcf..853fff1e8 100644
--- a/src/frontend/src/components/codeAreaComponent/index.tsx
+++ b/src/frontend/src/components/codeAreaComponent/index.tsx
@@ -1,6 +1,6 @@
import { useEffect, useState } from "react";
import CodeAreaModal from "../../modals/codeAreaModal";
-import { TextAreaComponentType } from "../../types/components";
+import { CodeAreaComponentType } from "../../types/components";
import IconComponent from "../genericIconComponent";
@@ -10,8 +10,9 @@ export default function CodeAreaComponent({
disabled,
editNode = false,
nodeClass,
+ dynamic,
setNodeClass,
-}: TextAreaComponentType) {
+}: CodeAreaComponentType) {
const [myValue, setMyValue] = useState(
typeof value == "string" ? value : JSON.stringify(value)
);
@@ -29,12 +30,13 @@ export default function CodeAreaComponent({
return (
{
- setMyValue(t);
- onChange(t);
+ setValue={(value: string) => {
+ setMyValue(value);
+ onChange(value);
}}
>
@@ -43,7 +45,7 @@ export default function CodeAreaComponent({
editNode
? "input-edit-node input-dialog"
: (disabled ? " input-disable input-ring " : "") +
- " input-primary text-muted-foreground "
+ " primary-input text-muted-foreground "
}
>
{myValue !== "" ? myValue : "Type something..."}
diff --git a/src/frontend/src/components/codeTabsComponent/index.tsx b/src/frontend/src/components/codeTabsComponent/index.tsx
index 1357e0c79..62cc87f36 100644
--- a/src/frontend/src/components/codeTabsComponent/index.tsx
+++ b/src/frontend/src/components/codeTabsComponent/index.tsx
@@ -132,21 +132,25 @@ export default function CodeTabsComponent({
}}
>
-
- {tabs.map((tab, index) => (
-
- {tab.name}
-
- ))}
-
- {Number(activeTab) < 3 && (
-
+ {tabs.length > 0 && tabs[0].name !== "" ? (
+
+ {tabs.map((tab, index) => (
+
+ {tab.name}
+
+ ))}
+
+ ) : (
+
+ )}
+ {Number(activeTab) < 4 && (
+
- {index < 3 ? (
-
- {tab.code}
-
- ) : index === 3 ? (
+ {index < 4 ? (
+ <>
+ {tab.description && (
+
+ )}
+
+ {tab.code}
+
+ >
+ ) : index === 4 ? (
<>
- {data.map((t: any, index) => (
+ {data.map((node: any, index) => (
- {tweaks.tweaksList.current.includes(t["data"]["id"]) && (
+ {tweaks.tweaksList.current.includes(
+ node["data"]["id"]
+ ) && (
@@ -214,78 +228,92 @@ export default function CodeTabsComponent({
- {Object.keys(t["data"]["node"]["template"])
+ {Object.keys(node["data"]["node"]["template"])
.filter(
- (n) =>
- n.charAt(0) !== "_" &&
- t.data.node.template[n].show &&
- (t.data.node.template[n].type === "str" ||
- t.data.node.template[n].type ===
- "bool" ||
- t.data.node.template[n].type ===
- "float" ||
- t.data.node.template[n].type ===
- "code" ||
- t.data.node.template[n].type ===
- "prompt" ||
- t.data.node.template[n].type ===
- "file" ||
- t.data.node.template[n].type === "int")
+ (templateField) =>
+ templateField.charAt(0) !== "_" &&
+ node.data.node.template[templateField]
+ .show &&
+ (node.data.node.template[templateField]
+ .type === "str" ||
+ node.data.node.template[templateField]
+ .type === "bool" ||
+ node.data.node.template[templateField]
+ .type === "float" ||
+ node.data.node.template[templateField]
+ .type === "code" ||
+ node.data.node.template[templateField]
+ .type === "prompt" ||
+ node.data.node.template[templateField]
+ .type === "file" ||
+ node.data.node.template[templateField]
+ .type === "int")
)
- .map((n, i) => {
+ .map((templateField, index) => {
return (
- {n}
+ {templateField}
- {t.data.node.template[n].type ===
- "str" &&
- !t.data.node.template[n].options ? (
+ {node.data.node.template[
+ templateField
+ ].type === "str" &&
+ !node.data.node.template[
+ templateField
+ ].options ? (
- {t.data.node.template[n]
- .list ? (
+ {node.data.node.template[
+ templateField
+ ].list ? (
{
+ onChange={(target) => {
setData((old) => {
let newInputList =
cloneDeep(old);
newInputList[
index
].data.node.template[
- n
- ].value = k;
+ templateField
+ ].value = target;
return newInputList;
});
tweaks.buildTweakObject(
- t["data"]["id"],
- k,
- t.data.node.template[n]
+ node["data"]["id"],
+ target,
+ node.data.node.template[
+ templateField
+ ]
);
}}
/>
- ) : t.data.node.template[n]
- .multiline ? (
+ ) : node.data.node.template[
+ templateField
+ ].multiline ? (
@@ -293,33 +321,38 @@ export default function CodeTabsComponent({
disabled={false}
editNode={true}
value={
- !t.data.node.template[
- n
+ !node.data.node
+ .template[
+ templateField
].value ||
- t.data.node.template[
- n
+ node.data.node
+ .template[
+ templateField
].value === ""
? ""
- : t.data.node
- .template[n]
- .value
+ : node.data.node
+ .template[
+ templateField
+ ].value
}
- onChange={(k) => {
+ onChange={(target) => {
setData((old) => {
let newInputList =
cloneDeep(old);
newInputList[
index
].data.node.template[
- n
- ].value = k;
+ templateField
+ ].value = target;
return newInputList;
});
tweaks.buildTweakObject(
- t["data"]["id"],
- k,
- t.data.node
- .template[n]
+ node["data"]["id"],
+ target,
+ node.data.node
+ .template[
+ templateField
+ ]
);
}}
/>
@@ -330,47 +363,55 @@ export default function CodeTabsComponent({
editNode={true}
disabled={false}
password={
- t.data.node.template[n]
- .password ?? false
+ node.data.node.template[
+ templateField
+ ].password ?? false
}
value={
- !t.data.node.template[n]
- .value ||
- t.data.node.template[n]
- .value === ""
+ !node.data.node.template[
+ templateField
+ ].value ||
+ node.data.node.template[
+ templateField
+ ].value === ""
? ""
- : t.data.node.template[
- n
+ : node.data.node
+ .template[
+ templateField
].value
}
- onChange={(k) => {
+ onChange={(target) => {
setData((old) => {
let newInputList =
cloneDeep(old);
newInputList[
index
].data.node.template[
- n
- ].value = k;
+ templateField
+ ].value = target;
return newInputList;
});
tweaks.buildTweakObject(
- t["data"]["id"],
- k,
- t.data.node.template[n]
+ node["data"]["id"],
+ target,
+ node.data.node.template[
+ templateField
+ ]
);
}}
/>
)}
- ) : t.data.node.template[n].type ===
- "bool" ? (
+ ) : node.data.node.template[
+ templateField
+ ].type === "bool" ? (
{" "}
{
setData((old) => {
@@ -379,31 +420,37 @@ export default function CodeTabsComponent({
newInputList[
index
].data.node.template[
- n
+ templateField
].value = e;
return newInputList;
});
tweaks.buildTweakObject(
- t["data"]["id"],
+ node["data"]["id"],
e,
- t.data.node.template[n]
+ node.data.node.template[
+ templateField
+ ]
);
}}
size="small"
disabled={false}
/>
- ) : t.data.node.template[n].type ===
- "file" ? (
+ ) : node.data.node.template[
+ templateField
+ ].type === "file" ? (
@@ -411,143 +458,176 @@ export default function CodeTabsComponent({
editNode={true}
disabled={false}
value={
- t.data.node.template[n]
- .value ?? ""
+ node.data.node.template[
+ templateField
+ ].value ?? ""
}
- onChange={(k: any) => {}}
+ onChange={(
+ target: any
+ ) => {}}
fileTypes={
- t.data.node.template[n]
- .fileTypes
+ node.data.node.template[
+ templateField
+ ].fileTypes
}
suffixes={
- t.data.node.template[n]
- .suffixes
+ node.data.node.template[
+ templateField
+ ].suffixes
}
onFileChange={(
- k: any
- ) => {}}
+ value: any
+ ) => {
+ node.data.node.template[
+ templateField
+ ].file_path = value;
+ }}
>
- ) : t.data.node.template[n].type ===
- "float" ? (
+ ) : node.data.node.template[
+ templateField
+ ].type === "float" ? (
{
+ onChange={(target) => {
setData((old) => {
let newInputList =
cloneDeep(old);
newInputList[
index
].data.node.template[
- n
- ].value = k;
+ templateField
+ ].value = target;
return newInputList;
});
tweaks.buildTweakObject(
- t["data"]["id"],
- k,
- t.data.node.template[n]
+ node["data"]["id"],
+ target,
+ node.data.node.template[
+ templateField
+ ]
);
}}
/>
- ) : t.data.node.template[n].type ===
- "str" &&
- t.data.node.template[n]
- .options ? (
+ ) : node.data.node.template[
+ templateField
+ ].type === "str" &&
+ node.data.node.template[
+ templateField
+ ].options ? (
{
+ onSelect={(target) => {
setData((old) => {
let newInputList =
cloneDeep(old);
newInputList[
index
].data.node.template[
- n
- ].value = k;
+ templateField
+ ].value = target;
return newInputList;
});
tweaks.buildTweakObject(
- t["data"]["id"],
- k,
- t.data.node.template[n]
+ node["data"]["id"],
+ target,
+ node.data.node.template[
+ templateField
+ ]
);
}}
value={
- !t.data.node.template[n]
- .value ||
- t.data.node.template[n]
- .value === ""
+ !node.data.node.template[
+ templateField
+ ].value ||
+ node.data.node.template[
+ templateField
+ ].value === ""
? ""
- : t.data.node.template[n]
- .value
+ : node.data.node.template[
+ templateField
+ ].value
}
>
- ) : t.data.node.template[n].type ===
- "int" ? (
+ ) : node.data.node.template[
+ templateField
+ ].type === "int" ? (
{
+ onChange={(target) => {
setData((old) => {
let newInputList =
cloneDeep(old);
newInputList[
index
].data.node.template[
- n
- ].value = k;
+ templateField
+ ].value = target;
return newInputList;
});
tweaks.buildTweakObject(
- t["data"]["id"],
- k,
- t.data.node.template[n]
+ node["data"]["id"],
+ target,
+ node.data.node.template[
+ templateField
+ ]
);
}}
/>
- ) : t.data.node.template[n].type ===
- "prompt" ? (
+ ) : node.data.node.template[
+ templateField
+ ].type === "prompt" ? (
@@ -555,44 +635,53 @@ export default function CodeTabsComponent({
editNode={true}
disabled={false}
value={
- !t.data.node.template[n]
- .value ||
- t.data.node.template[n]
- .value === ""
+ !node.data.node.template[
+ templateField
+ ].value ||
+ node.data.node.template[
+ templateField
+ ].value === ""
? ""
- : t.data.node.template[
- n
+ : node.data.node
+ .template[
+ templateField
].value
}
- onChange={(k) => {
+ onChange={(target) => {
setData((old) => {
let newInputList =
cloneDeep(old);
newInputList[
index
].data.node.template[
- n
- ].value = k;
+ templateField
+ ].value = target;
return newInputList;
});
tweaks.buildTweakObject(
- t["data"]["id"],
- k,
- t.data.node.template[n]
+ node["data"]["id"],
+ target,
+ node.data.node.template[
+ templateField
+ ]
);
}}
/>
- ) : t.data.node.template[n].type ===
- "code" ? (
+ ) : node.data.node.template[
+ templateField
+ ].type === "code" ? (
@@ -601,37 +690,43 @@ export default function CodeTabsComponent({
disabled={false}
editNode={true}
value={
- !t.data.node.template[n]
- .value ||
- t.data.node.template[n]
- .value === ""
+ !node.data.node.template[
+ templateField
+ ].value ||
+ node.data.node.template[
+ templateField
+ ].value === ""
? ""
- : t.data.node.template[
- n
+ : node.data.node
+ .template[
+ templateField
].value
}
- onChange={(k) => {
+ onChange={(target) => {
setData((old) => {
let newInputList =
cloneDeep(old);
newInputList[
index
].data.node.template[
- n
- ].value = k;
+ templateField
+ ].value = target;
return newInputList;
});
tweaks.buildTweakObject(
- t["data"]["id"],
- k,
- t.data.node.template[n]
+ node["data"]["id"],
+ target,
+ node.data.node.template[
+ templateField
+ ]
);
}}
/>
- ) : t.data.node.template[n].type ===
- "Any" ? (
+ ) : node.data.node.template[
+ templateField
+ ].type === "Any" ? (
"-"
) : (
diff --git a/src/frontend/src/components/floatComponent/index.tsx b/src/frontend/src/components/floatComponent/index.tsx
index 40d6fc3fe..ae3719959 100644
--- a/src/frontend/src/components/floatComponent/index.tsx
+++ b/src/frontend/src/components/floatComponent/index.tsx
@@ -25,12 +25,12 @@ export default function FloatComponent({
type="number"
step={step}
min={min}
- onInput={(e: React.ChangeEvent<HTMLInputElement>) => {
- if (e.target.value < min.toString()) {
- e.target.value = min.toString();
+ onInput={(event: React.ChangeEvent<HTMLInputElement>) => {
+ if (event.target.value < min.toString()) {
+ event.target.value = min.toString();
}
- if (e.target.value > max.toString()) {
- e.target.value = max.toString();
+ if (event.target.value > max.toString()) {
+ event.target.value = max.toString();
}
}}
max={max}
@@ -40,8 +40,8 @@ export default function FloatComponent({
placeholder={
editNode ? "Number 0 to 1" : "Type a number from zero to one"
}
- onChange={(e) => {
- onChange(e.target.value);
+ onChange={(event) => {
+ onChange(event.target.value);
}}
/>
diff --git a/src/frontend/src/components/headerComponent/index.tsx b/src/frontend/src/components/headerComponent/index.tsx
index 91e9bbd8a..ddae6068c 100644
--- a/src/frontend/src/components/headerComponent/index.tsx
+++ b/src/frontend/src/components/headerComponent/index.tsx
@@ -31,9 +31,24 @@ export default function Header() {
return (
-
-
โ๏ธ
-
+ {tabId === "" || !tabId ? (
+
+ ) : (
+
+
โ๏ธ
+
+ )}
+
{flows.findIndex((f) => tabId === f.id) !== -1 && tabId !== "" && (
)}
diff --git a/src/frontend/src/components/inputComponent/index.tsx b/src/frontend/src/components/inputComponent/index.tsx
index 78b06c411..5345239ef 100644
--- a/src/frontend/src/components/inputComponent/index.tsx
+++ b/src/frontend/src/components/inputComponent/index.tsx
@@ -31,8 +31,8 @@ export default function InputComponent({
password && !editNode ? "pr-10" : ""
)}
placeholder={password && editNode ? "Key" : "Type something..."}
- onChange={(e) => {
- onChange(e.target.value);
+ onChange={(event) => {
+ onChange(event.target.value);
}}
/>
{password && (
diff --git a/src/frontend/src/components/inputFileComponent/index.tsx b/src/frontend/src/components/inputFileComponent/index.tsx
index 3f77734f3..53d1ad4de 100644
--- a/src/frontend/src/components/inputFileComponent/index.tsx
+++ b/src/frontend/src/components/inputFileComponent/index.tsx
@@ -49,11 +49,11 @@ export default function InputFileComponent({
input.style.display = "none"; // Hidden from view
input.multiple = false; // Allow only one file selection
- input.onchange = (e: Event) => {
+ input.onchange = (event: Event) => {
setLoading(true);
// Get the selected file
- const file = (e.target as HTMLInputElement).files?.[0];
+ const file = (event.target as HTMLInputElement).files?.[0];
// Check if the file type is correct
if (file && checkFileType(file.name)) {
@@ -66,12 +66,12 @@ export default function InputFileComponent({
const { file_path } = data;
console.log("File name:", file_path);
+ // sets the value that goes to the backend
+ onFileChange(file_path);
// Update the state and callback with the name of the file
// sets the value to the user
setMyValue(file.name);
onChange(file.name);
- // sets the value that goes to the backend
- onFileChange(file_path);
setLoading(false);
})
.catch(() => {
@@ -102,8 +102,8 @@ export default function InputFileComponent({
editNode
? "input-edit-node input-dialog text-muted-foreground"
: disabled
- ? "input-disable input-dialog input-primary"
- : "input-dialog input-primary text-muted-foreground"
+ ? "input-disable input-dialog primary-input"
+ : "input-dialog primary-input text-muted-foreground"
}
>
{myValue !== "" ? myValue : "No file"}
diff --git a/src/frontend/src/components/inputListComponent/index.tsx b/src/frontend/src/components/inputListComponent/index.tsx
index 13f5f7cca..c66e93c8f 100644
--- a/src/frontend/src/components/inputListComponent/index.tsx
+++ b/src/frontend/src/components/inputListComponent/index.tsx
@@ -25,18 +25,18 @@ export default function InputListComponent({
"flex flex-col gap-3"
)}
>
- {value.map((i, idx) => {
+ {value.map((singleValue, idx) => {
return (
{
+ onChange={(event) => {
let newInputList = _.cloneDeep(value);
- newInputList[idx] = e.target.value;
+ newInputList[idx] = event.target.value;
onChange(newInputList);
}}
/>
diff --git a/src/frontend/src/components/intComponent/index.tsx b/src/frontend/src/components/intComponent/index.tsx
index c43055b47..504f3b816 100644
--- a/src/frontend/src/components/intComponent/index.tsx
+++ b/src/frontend/src/components/intComponent/index.tsx
@@ -41,17 +41,17 @@ export default function IntComponent({
type="number"
step="1"
min={min}
- onInput={(e: React.ChangeEvent<HTMLInputElement>) => {
- if (e.target.value < min.toString()) {
- e.target.value = min.toString();
+ onInput={(event: React.ChangeEvent<HTMLInputElement>) => {
+ if (event.target.value < min.toString()) {
+ event.target.value = min.toString();
}
}}
value={value ?? ""}
className={editNode ? "input-edit-node" : ""}
disabled={disabled}
placeholder={editNode ? "Integer number" : "Type an integer number"}
- onChange={(e) => {
- onChange(e.target.value);
+ onChange={(event) => {
+ onChange(event.target.value);
}}
/>
diff --git a/src/frontend/src/components/promptComponent/index.tsx b/src/frontend/src/components/promptComponent/index.tsx
index d212ec08d..e3393512b 100644
--- a/src/frontend/src/components/promptComponent/index.tsx
+++ b/src/frontend/src/components/promptComponent/index.tsx
@@ -39,8 +39,8 @@ export default function PromptAreaComponent({
value={value}
buttonText="Check & Save"
modalTitle="Edit Prompt"
- setValue={(t: string) => {
- onChange(t);
+ setValue={(value: string) => {
+ onChange(value);
}}
nodeClass={nodeClass}
setNodeClass={setNodeClass}
@@ -51,7 +51,7 @@ export default function PromptAreaComponent({
editNode
? "input-edit-node input-dialog"
: (disabled ? " input-disable text-ring " : "") +
- " input-primary text-muted-foreground "
+ " primary-input text-muted-foreground "
}
>
{value !== "" ? value : "Type your prompt here..."}
diff --git a/src/frontend/src/components/textAreaComponent/index.tsx b/src/frontend/src/components/textAreaComponent/index.tsx
index cd598f01e..7c5af6da9 100644
--- a/src/frontend/src/components/textAreaComponent/index.tsx
+++ b/src/frontend/src/components/textAreaComponent/index.tsx
@@ -25,8 +25,8 @@ export default function TextAreaComponent({
disabled={disabled}
className={editNode ? "input-edit-node" : ""}
placeholder={"Type something..."}
- onChange={(e) => {
- onChange(e.target.value);
+ onChange={(event) => {
+ onChange(event.target.value);
}}
/>
@@ -35,8 +35,8 @@ export default function TextAreaComponent({
buttonText="Finishing Editing"
modalTitle="Edit Text"
value={value}
- setValue={(t: string) => {
- onChange(t);
+ setValue={(value: string) => {
+ onChange(value);
}}
>
{!editNode && (
diff --git a/src/frontend/src/components/toggleComponent/index.tsx b/src/frontend/src/components/toggleComponent/index.tsx
index 210c9223a..23acccee7 100644
--- a/src/frontend/src/components/toggleComponent/index.tsx
+++ b/src/frontend/src/components/toggleComponent/index.tsx
@@ -18,8 +18,8 @@ export default function ToggleComponent({
{
- setEnabled(x);
+ onChange={(isEnabled: boolean) => {
+ setEnabled(isEnabled);
}}
className={classNames(
enabled ? "bg-primary" : "bg-input",
diff --git a/src/frontend/src/components/toggleShadComponent/index.tsx b/src/frontend/src/components/toggleShadComponent/index.tsx
index 95ef6d062..5671b6465 100644
--- a/src/frontend/src/components/toggleShadComponent/index.tsx
+++ b/src/frontend/src/components/toggleShadComponent/index.tsx
@@ -35,8 +35,8 @@ export default function ToggleShadComponent({
disabled={disabled}
className=""
checked={enabled}
- onCheckedChange={(x: boolean) => {
- setEnabled(x);
+ onCheckedChange={(isEnabled: boolean) => {
+ setEnabled(isEnabled);
}}
>
diff --git a/src/frontend/src/components/ui/input.tsx b/src/frontend/src/components/ui/input.tsx
index c025f8de3..b014a57b3 100644
--- a/src/frontend/src/components/ui/input.tsx
+++ b/src/frontend/src/components/ui/input.tsx
@@ -9,7 +9,7 @@ const Input = React.forwardRef<HTMLInputElement, InputProps>(
return (
diff --git a/src/frontend/src/components/ui/rename-label.tsx b/src/frontend/src/components/ui/rename-label.tsx
index e1cef4996..8b12fdfb0 100644
--- a/src/frontend/src/components/ui/rename-label.tsx
+++ b/src/frontend/src/components/ui/rename-label.tsx
@@ -15,8 +15,8 @@ export default function RenameLabel(props) {
useEffect(() => {
if (isRename) {
setMyValue(props.value);
- document.addEventListener("keydown", (e) => {
- if (e.key === "Escape") {
+ document.addEventListener("keydown", (event) => {
+ if (event.key === "Escape") {
setIsRename(false);
props.setValue("");
}
@@ -67,8 +67,8 @@ export default function RenameLabel(props) {
}
}}
value={myValue}
- onChange={(e) => {
- setMyValue(e.target.value);
+ onChange={(event) => {
+ setMyValue(event.target.value);
}}
/>
) : (
diff --git a/src/frontend/src/components/ui/tooltip.tsx b/src/frontend/src/components/ui/tooltip.tsx
index 3d31ab66d..8ea9a9505 100644
--- a/src/frontend/src/components/ui/tooltip.tsx
+++ b/src/frontend/src/components/ui/tooltip.tsx
@@ -19,7 +19,7 @@ const TooltipContent = React.forwardRef<
ref={ref}
sideOffset={sideOffset}
className={cn(
- "overflow-hidden rounded-md border bg-popover px-3 py-1.5 text-sm text-popover-foreground shadow-md animate-in fade-in-50 data-[side=bottom]:slide-in-from-top-1 data-[side=left]:slide-in-from-right-1 data-[side=right]:slide-in-from-left-1 data-[side=top]:slide-in-from-bottom-1",
+ "overflow-y-auto rounded-md border bg-popover px-3 py-1.5 text-sm text-popover-foreground shadow-md animate-in fade-in-50 data-[side=bottom]:slide-in-from-top-1 data-[side=left]:slide-in-from-right-1 data-[side=right]:slide-in-from-left-1 data-[side=top]:slide-in-from-bottom-1",
className
)}
{...props}
diff --git a/src/frontend/src/constants/constants.ts b/src/frontend/src/constants/constants.ts
index 195570be0..ab9de7f63 100644
--- a/src/frontend/src/constants/constants.ts
+++ b/src/frontend/src/constants/constants.ts
@@ -143,6 +143,12 @@ export const TEXT_DIALOG_SUBTITLE = "Edit your text.";
export const IMPORT_DIALOG_SUBTITLE =
"Upload a JSON file or select from the available community examples.";
+/**
+ * The text that shows when a tooltip is empty
+ * @constant
+ */
+export const TOOLTIP_EMPTY = "No compatible components found.";
+
/**
* The base text for subtitle of code dialog
* @constant
@@ -491,3 +497,95 @@ export const NOUNS: string[] = [
*
*/
export const USER_PROJECTS_HEADER = "My Collection";
+
+/**
+ * URLs excluded from error retries.
+ * @constant
+ *
+ */
+export const URL_EXCLUDED_FROM_ERROR_RETRIES = [
+ "/api/v1/validate/code",
+ "/api/v1/custom_component",
+ "/api/v1/validate/prompt",
+];
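A minimal sketch of how an error-handling interceptor might consult this exclusion list before scheduling a retry. The `isRetryEligible` helper is hypothetical and not part of this diff; only the constant itself is taken from the change above.

```typescript
// Hypothetical helper: decide whether a failed request URL is eligible
// for an automatic retry. Validation endpoints are excluded because
// retrying them would just re-surface the same user-facing error.
const URL_EXCLUDED_FROM_ERROR_RETRIES = [
  "/api/v1/validate/code",
  "/api/v1/custom_component",
  "/api/v1/validate/prompt",
];

function isRetryEligible(url: string): boolean {
  // A URL is retryable only if it matches none of the excluded paths.
  return !URL_EXCLUDED_FROM_ERROR_RETRIES.some((excluded) =>
    url.includes(excluded)
  );
}

console.log(isRetryEligible("/api/v1/flows")); // true
console.log(isRetryEligible("/api/v1/validate/prompt")); // false
```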
+
+export const tabsCode = [];
+
+export function tabsArray(codes: string[], method: number) {
+ if (method == null) return;
+ if (method === 0) {
+ return [
+ {
+ name: "cURL",
+ mode: "bash",
+ image: "https://curl.se/logo/curl-symbol-transparent.png",
+ language: "sh",
+ code: codes[0],
+ },
+ {
+ name: "Python API",
+ mode: "python",
+ image:
+ "https://images.squarespace-cdn.com/content/v1/5df3d8c5d2be5962e4f87890/1628015119369-OY4TV3XJJ53ECO0W2OLQ/Python+API+Training+Logo.png?format=1000w",
+ language: "py",
+ code: codes[1],
+ },
+ {
+ name: "Python Code",
+ mode: "python",
+ image: "https://cdn-icons-png.flaticon.com/512/5968/5968350.png",
+ language: "py",
+ code: codes[2],
+ },
+ {
+ name: "Chat Widget HTML",
+ description:
+ "Insert this code anywhere in your <body> tag. To use it with React and other libraries, check our documentation.",
+ mode: "html",
+ image: "https://cdn-icons-png.flaticon.com/512/5968/5968350.png",
+ language: "html",
+ code: codes[3],
+ },
+ ];
+ }
+ return [
+ {
+ name: "cURL",
+ mode: "bash",
+ image: "https://curl.se/logo/curl-symbol-transparent.png",
+ language: "sh",
+ code: codes[0],
+ },
+ {
+ name: "Python API",
+ mode: "python",
+ image:
+ "https://images.squarespace-cdn.com/content/v1/5df3d8c5d2be5962e4f87890/1628015119369-OY4TV3XJJ53ECO0W2OLQ/Python+API+Training+Logo.png?format=1000w",
+ language: "py",
+ code: codes[1],
+ },
+ {
+ name: "Python Code",
+ mode: "python",
+ language: "py",
+ image: "https://cdn-icons-png.flaticon.com/512/5968/5968350.png",
+ code: codes[2],
+ },
+ {
+ name: "Chat Widget HTML",
+ description:
+ "Insert this code anywhere in your &lt;body&gt; tag. To use with React and other libs, check our documentation.",
+ mode: "html",
+ image: "https://cdn-icons-png.flaticon.com/512/5968/5968350.png",
+ language: "html",
+ code: codes[3],
+ },
+ {
+ name: "Tweaks",
+ mode: "python",
+ image: "https://cdn-icons-png.flaticon.com/512/5968/5968350.png",
+ language: "py",
+ code: codes[4],
+ },
+ ];
+}
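A note on the `tabsArray` helper above: its `method` selector is numeric, and a truthiness guard such as `if (!method) return;` also rejects `0`, which silently disables any `method === 0` branch placed after it. A minimal sketch of the pitfall, using a hypothetical `selectMode` helper (not from the codebase):

```typescript
// Hypothetical selector mirroring a numeric "method" flag. `!method` is
// true for 0, so the guard must compare against null/undefined explicitly
// to keep 0 usable as a valid selector value.
function selectMode(method?: number): string | undefined {
  if (method == null) return undefined; // rejects only null/undefined, not 0
  return method === 0 ? "api-tabs" : "tweaks-tabs";
}
```

With a `!method` guard instead, `selectMode(0)` would return `undefined` and the first branch would be unreachable.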
diff --git a/src/frontend/src/contexts/tabsContext.tsx b/src/frontend/src/contexts/tabsContext.tsx
index 30df2a8bd..5adc72e72 100644
--- a/src/frontend/src/contexts/tabsContext.tsx
+++ b/src/frontend/src/contexts/tabsContext.tsx
@@ -20,7 +20,11 @@ import {
import { APIClassType, APITemplateType } from "../types/api";
import { FlowType, NodeType } from "../types/flow";
import { TabsContextType, TabsState } from "../types/tabs";
-import { addVersionToDuplicates, updateIds, updateTemplate } from "../utils/reactflowUtils";
+import {
+ addVersionToDuplicates,
+ updateIds,
+ updateTemplate,
+} from "../utils/reactflowUtils";
import { getRandomDescription, getRandomName } from "../utils/utils";
import { alertContext } from "./alertContext";
import { typesContext } from "./typesContext";
@@ -40,6 +44,8 @@ const TabsContextInitialValue: TabsContextType = {
downloadFlows: () => {},
uploadFlows: () => {},
uploadFlow: () => {},
+ isBuilt: false,
+ setIsBuilt: (state: boolean) => {},
hardReset: () => {},
saveFlow: async (flow: FlowType) => {},
lastCopiedSelection: null,
@@ -310,11 +316,13 @@ export function TabsProvider({ children }: { children: ReactNode }) {
const input = document.createElement("input");
input.type = "file";
// add a change event listener to the file input
- input.onchange = (e: Event) => {
+ input.onchange = (event: Event) => {
// check if the file type is application/json
- if ((e.target as HTMLInputElement).files[0].type === "application/json") {
+ if (
+ (event.target as HTMLInputElement).files[0].type === "application/json"
+ ) {
// get the file from the file input
- const file = (e.target as HTMLInputElement).files[0];
+ const file = (event.target as HTMLInputElement).files[0];
// read the file as text
const formData = new FormData();
formData.append("file", file);
@@ -353,12 +361,12 @@ export function TabsProvider({ children }: { children: ReactNode }) {
let idsMap = {};
let nodes = reactFlowInstance.getNodes();
let edges = reactFlowInstance.getEdges();
- selectionInstance.nodes.forEach((n) => {
- if (n.position.y < minimumY) {
- minimumY = n.position.y;
+ selectionInstance.nodes.forEach((node) => {
+ if (node.position.y < minimumY) {
+ minimumY = node.position.y;
}
- if (n.position.x < minimumX) {
- minimumX = n.position.x;
+ if (node.position.x < minimumX) {
+ minimumX = node.position.x;
}
});
@@ -366,43 +374,43 @@ export function TabsProvider({ children }: { children: ReactNode }) {
? { x: position.paneX + position.x, y: position.paneY + position.y }
: reactFlowInstance.project({ x: position.x, y: position.y });
- selectionInstance.nodes.forEach((n: NodeType) => {
+ selectionInstance.nodes.forEach((node: NodeType) => {
// Generate a unique node ID
- let newId = getNodeId(n.data.type);
- idsMap[n.id] = newId;
+ let newId = getNodeId(node.data.type);
+ idsMap[node.id] = newId;
// Create a new node object
const newNode: NodeType = {
id: newId,
type: "genericNode",
position: {
- x: insidePosition.x + n.position.x - minimumX,
- y: insidePosition.y + n.position.y - minimumY,
+ x: insidePosition.x + node.position.x - minimumX,
+ y: insidePosition.y + node.position.y - minimumY,
},
data: {
- ..._.cloneDeep(n.data),
+ ..._.cloneDeep(node.data),
id: newId,
},
};
// Add the new node to the list of nodes in state
nodes = nodes
- .map((e) => ({ ...e, selected: false }))
+ .map((node) => ({ ...node, selected: false }))
.concat({ ...newNode, selected: false });
});
reactFlowInstance.setNodes(nodes);
- selectionInstance.edges.forEach((e) => {
- let source = idsMap[e.source];
- let target = idsMap[e.target];
- let sourceHandleSplitted = e.sourceHandle.split("|");
+ selectionInstance.edges.forEach((edge) => {
+ let source = idsMap[edge.source];
+ let target = idsMap[edge.target];
+ let sourceHandleSplitted = edge.sourceHandle.split("|");
let sourceHandle =
sourceHandleSplitted[0] +
"|" +
source +
"|" +
sourceHandleSplitted.slice(2).join("|");
- let targetHandleSplitted = e.targetHandle.split("|");
+ let targetHandleSplitted = edge.targetHandle.split("|");
let targetHandle =
targetHandleSplitted.slice(0, -1).join("|") + "|" + target;
let id =
@@ -427,7 +435,7 @@ export function TabsProvider({ children }: { children: ReactNode }) {
animated: targetHandle.split("|")[0] === "Text",
selected: false,
},
- edges.map((e) => ({ ...e, selected: false }))
+ edges.map((edge) => ({ ...edge, selected: false }))
);
});
reactFlowInstance.setEdges(edges);
@@ -587,10 +595,14 @@ export function TabsProvider({ children }: { children: ReactNode }) {
}
}
+ const [isBuilt, setIsBuilt] = useState(false);
+
return (
- reactFlowInstance.getNodes().filter((n: Node) => n.id !== idx)
+ reactFlowInstance.getNodes().filter((node: Node) => node.id !== idx)
);
reactFlowInstance.setEdges(
reactFlowInstance
.getEdges()
- .filter((ns) => ns.source !== idx && ns.target !== idx)
+ .filter((edge) => edge.source !== idx && edge.target !== idx)
);
}
return (
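The paste logic in this hunk normalizes every node's position against the selection's top-left corner (`minimumX`/`minimumY`) before offsetting it to the paste target, so the pasted group keeps its shape. A self-contained sketch of that normalization (names are illustrative, not from the codebase):

```typescript
interface Point { x: number; y: number; }

// Shift a group of positions so its top-left corner lands on `target`
// while preserving the relative layout of the group.
function normalizePositions(nodes: Point[], target: Point): Point[] {
  const minX = Math.min(...nodes.map((n) => n.x));
  const minY = Math.min(...nodes.map((n) => n.y));
  return nodes.map((n) => ({
    x: target.x + n.x - minX,
    y: target.y + n.y - minY,
  }));
}
```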
diff --git a/src/frontend/src/contexts/undoRedoContext.tsx b/src/frontend/src/contexts/undoRedoContext.tsx
index dbe0baa45..e2473c39e 100644
--- a/src/frontend/src/contexts/undoRedoContext.tsx
+++ b/src/frontend/src/contexts/undoRedoContext.tsx
@@ -45,14 +45,18 @@ export function UndoRedoProvider({ children }) {
const [past, setPast] = useState(flows.map(() => []));
const [future, setFuture] = useState(flows.map(() => []));
const [tabIndex, setTabIndex] = useState(
- flows.findIndex((f) => f.id === tabId)
+ flows.findIndex((flow) => flow.id === tabId)
);
useEffect(() => {
// whenever the flows variable changes, we need to add one array to the past and future states
- setPast((old) => flows.map((f, i) => (old[i] ? old[i] : [])));
- setFuture((old) => flows.map((f, i) => (old[i] ? old[i] : [])));
- setTabIndex(flows.findIndex((f) => f.id === tabId));
+ setPast((old) =>
+ flows.map((flow, index) => (old[index] ? old[index] : []))
+ );
+ setFuture((old) =>
+ flows.map((flow, index) => (old[index] ? old[index] : []))
+ );
+ setTabIndex(flows.findIndex((flow) => flow.id === tabId));
}, [flows, tabId]);
const { setNodes, setEdges, getNodes, getEdges } = useReactFlow();
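The effect above keeps one undo (past) and one redo (future) stack per flow, reusing the existing stack at each index and creating an empty one for newly added flows. Factored into a pure helper (hypothetical name, for illustration):

```typescript
// One history stack per flow: reuse existing stacks by index and
// create an empty stack for each newly added flow.
function alignStacks<T>(old: T[][], flowCount: number): T[][] {
  return Array.from({ length: flowCount }, (_, index) => old[index] ?? []);
}
```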
diff --git a/src/frontend/src/controllers/API/api.tsx b/src/frontend/src/controllers/API/api.tsx
index 8dd9eac9f..9e716ebb7 100644
--- a/src/frontend/src/controllers/API/api.tsx
+++ b/src/frontend/src/controllers/API/api.tsx
@@ -1,5 +1,6 @@
import axios, { AxiosError, AxiosInstance } from "axios";
import { useContext, useEffect, useRef } from "react";
+import { URL_EXCLUDED_FROM_ERROR_RETRIES } from "../../constants/constants";
import { alertContext } from "../../contexts/alertContext";
// Create a new Axios instance
@@ -15,6 +16,9 @@ function ApiInterceptor() {
const interceptor = api.interceptors.response.use(
(response) => response,
async (error: AxiosError) => {
+ if (URL_EXCLUDED_FROM_ERROR_RETRIES.includes(error.config?.url)) {
+ return Promise.reject(error);
+ }
let retryCount = 0;
while (retryCount < 4) {
@@ -31,7 +35,7 @@ function ApiInterceptor() {
"Refresh the page",
"Use a new flow tab",
"Check if the backend is up",
- "Endpoint: " + error.config.url,
+ "Endpoint: " + error.config?.url,
],
});
return Promise.reject(error);
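The interceptor change rejects failures from the validation endpoints immediately instead of entering the retry loop, since those endpoints return meaningful errors that retrying would only delay. The decision can be sketched as a small predicate (the endpoint list is copied from the constant above; `shouldRetry` itself is illustrative):

```typescript
const EXCLUDED = [
  "/api/v1/validate/code",
  "/api/v1/custom_component",
  "/api/v1/validate/prompt",
];

// Retry only requests whose URL is known and not on the exclusion list.
function shouldRetry(url?: string): boolean {
  return url !== undefined && !EXCLUDED.includes(url);
}
```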
diff --git a/src/frontend/src/controllers/API/index.ts b/src/frontend/src/controllers/API/index.ts
index 6d24e7d09..9491f8973 100644
--- a/src/frontend/src/controllers/API/index.ts
+++ b/src/frontend/src/controllers/API/index.ts
@@ -339,3 +339,10 @@ export async function uploadFile(
formData.append("file", file);
return await api.post(`/api/v1/upload/${id}`, formData);
}
+
+export async function postCustomComponent(
+ code: string,
+ apiClass: APIClassType
+): Promise<AxiosResponse<APIClassType>> {
+ return await api.post(`/api/v1/custom_component`, { code });
+}
diff --git a/src/frontend/src/icons/GradientSparkles/index.tsx b/src/frontend/src/icons/GradientSparkles/index.tsx
new file mode 100644
index 000000000..b8f3534d5
--- /dev/null
+++ b/src/frontend/src/icons/GradientSparkles/index.tsx
@@ -0,0 +1,22 @@
+import { Infinity } from "lucide-react";
+import { forwardRef } from "react";
+
+const GradientSparkles = forwardRef>(
+ (props, ref) => {
+ return (
+ <>
+
+
+
+
+
+
+
+
+
+ >
+ );
+ }
+);
+
+export default GradientSparkles;
diff --git a/src/frontend/src/index.tsx b/src/frontend/src/index.tsx
index 31d8f21f1..2542f4903 100644
--- a/src/frontend/src/index.tsx
+++ b/src/frontend/src/index.tsx
@@ -5,9 +5,12 @@ import ContextWrapper from "./contexts";
import reportWebVitals from "./reportWebVitals";
import { ApiInterceptor } from "./controllers/API/api";
+// @ts-ignore
import "./style/index.css";
-import "./style/classes.css";
+// @ts-ignore
import "./style/applies.css";
+// @ts-ignore
+import "./style/classes.css";
const root = ReactDOM.createRoot(
document.getElementById("root") as HTMLElement
diff --git a/src/frontend/src/modals/ApiModal/index.tsx b/src/frontend/src/modals/ApiModal/index.tsx
index 1cbcff6c2..307e48208 100644
--- a/src/frontend/src/modals/ApiModal/index.tsx
+++ b/src/frontend/src/modals/ApiModal/index.tsx
@@ -21,6 +21,7 @@ import {
getCurlCode,
getPythonApiCode,
getPythonCode,
+ getWidgetCode,
} from "../../utils/utils";
import BaseModal from "../baseModal";
@@ -29,9 +30,11 @@ const ApiModal = forwardRef(
{
flow,
children,
+ disable,
}: {
flow: FlowType;
children: ReactNode;
+ disable: boolean;
},
ref
) => {
@@ -43,6 +46,7 @@ const ApiModal = forwardRef(
const pythonApiCode = getPythonApiCode(flow, tweak.current, tabsState);
const curl_code = getCurlCode(flow, tweak.current, tabsState);
const pythonCode = getPythonCode(flow, tweak.current, tabsState);
+ const widgetCode = getWidgetCode(flow, tabsState);
const tweaksCode = buildTweaks(flow);
const [tabs, setTabs] = useState([
{
@@ -67,6 +71,15 @@ const ApiModal = forwardRef(
language: "py",
code: pythonCode,
},
+ {
+ name: "Chat Widget HTML",
+ description:
+ "Insert this code anywhere in your &lt;body&gt; tag. To use with React and other libs, check our documentation.",
+ mode: "html",
+ image: "https://cdn-icons-png.flaticon.com/512/5968/5968350.png",
+ language: "html",
+ code: widgetCode,
+ },
]);
function startState() {
@@ -111,6 +124,15 @@ const ApiModal = forwardRef(
image: "https://cdn-icons-png.flaticon.com/512/5968/5968350.png",
code: pythonCode,
},
+ {
+ name: "Chat Widget HTML",
+ description:
+ "Insert this code anywhere in your &lt;body&gt; tag. To use with React and other libs, check our documentation.",
+ mode: "html",
+ image: "https://cdn-icons-png.flaticon.com/512/5968/5968350.png",
+ language: "html",
+ code: widgetCode,
+ },
{
name: "Tweaks",
mode: "python",
@@ -143,6 +165,15 @@ const ApiModal = forwardRef(
language: "py",
code: pythonCode,
},
+ {
+ name: "Chat Widget HTML",
+ description:
+ "Insert this code anywhere in your &lt;body&gt; tag. To use with React and other libs, check our documentation.",
+ mode: "html",
+ image: "https://cdn-icons-png.flaticon.com/512/5968/5968350.png",
+ language: "html",
+ code: widgetCode,
+ },
]);
}
}, [flow["data"]["nodes"], open]);
@@ -150,22 +181,22 @@ const ApiModal = forwardRef(
function filterNodes() {
let arrNodesWithValues = [];
- flow["data"]["nodes"].forEach((t) => {
- Object.keys(t["data"]["node"]["template"])
+ flow["data"]["nodes"].forEach((node) => {
+ Object.keys(node["data"]["node"]["template"])
.filter(
- (n) =>
- n.charAt(0) !== "_" &&
- t.data.node.template[n].show &&
- (t.data.node.template[n].type === "str" ||
- t.data.node.template[n].type === "bool" ||
- t.data.node.template[n].type === "float" ||
- t.data.node.template[n].type === "code" ||
- t.data.node.template[n].type === "prompt" ||
- t.data.node.template[n].type === "file" ||
- t.data.node.template[n].type === "int")
+ (templateField) =>
+ templateField.charAt(0) !== "_" &&
+ node.data.node.template[templateField].show &&
+ (node.data.node.template[templateField].type === "str" ||
+ node.data.node.template[templateField].type === "bool" ||
+ node.data.node.template[templateField].type === "float" ||
+ node.data.node.template[templateField].type === "code" ||
+ node.data.node.template[templateField].type === "prompt" ||
+ node.data.node.template[templateField].type === "file" ||
+ node.data.node.template[templateField].type === "int")
)
.map((n, i) => {
- arrNodesWithValues.push(t["id"]);
+ arrNodesWithValues.push(node["id"]);
});
});
@@ -210,13 +241,15 @@ const ApiModal = forwardRef(
tweak.current.push(newTweak);
}
- const pythonApiCode = getPythonApiCode(flow, tweak.current);
- const curl_code = getCurlCode(flow, tweak.current);
- const pythonCode = getPythonCode(flow, tweak.current);
+ const pythonApiCode = getPythonApiCode(flow, tweak.current, tabsState);
+ const curl_code = getCurlCode(flow, tweak.current, tabsState);
+ const pythonCode = getPythonCode(flow, tweak.current, tabsState);
+ const widgetCode = getWidgetCode(flow, tabsState);
tabs[0].code = curl_code;
tabs[1].code = pythonApiCode;
tabs[2].code = pythonCode;
+ tabs[3].code = widgetCode;
setTweak(tweak.current);
}
@@ -253,7 +286,7 @@ const ApiModal = forwardRef(
}
return (
-
+
{children}
Code
diff --git a/src/frontend/src/modals/EditNodeModal/index.tsx b/src/frontend/src/modals/EditNodeModal/index.tsx
index 3e1cd32a3..36c38e963 100644
--- a/src/frontend/src/modals/EditNodeModal/index.tsx
+++ b/src/frontend/src/modals/EditNodeModal/index.tsx
@@ -49,13 +49,15 @@ const EditNodeModal = forwardRef(
const { reactFlowInstance } = useContext(typesContext);
let disabled =
- reactFlowInstance?.getEdges().some((e) => e.targetHandle === data.id) ??
- false;
+ reactFlowInstance
+ ?.getEdges()
+ .some((edge) => edge.targetHandle === data.id) ?? false;
- function changeAdvanced(n) {
+ function changeAdvanced(templateParam) {
setMyData((old) => {
let newData = cloneDeep(old);
- newData.node.template[n].advanced = !newData.node.template[n].advanced;
+ newData.node.template[templateParam].advanced =
+ !newData.node.template[templateParam].advanced;
return newData;
});
}
@@ -112,51 +114,65 @@ const EditNodeModal = forwardRef(
{Object.keys(myData.node.template)
.filter(
- (t) =>
- t.charAt(0) !== "_" &&
- myData.node.template[t].show &&
- (myData.node.template[t].type === "str" ||
- myData.node.template[t].type === "bool" ||
- myData.node.template[t].type === "float" ||
- myData.node.template[t].type === "code" ||
- myData.node.template[t].type === "prompt" ||
- myData.node.template[t].type === "file" ||
- myData.node.template[t].type === "int")
+ (templateParam) =>
+ templateParam.charAt(0) !== "_" &&
+ myData.node.template[templateParam].show &&
+ (myData.node.template[templateParam].type ===
+ "str" ||
+ myData.node.template[templateParam].type ===
+ "bool" ||
+ myData.node.template[templateParam].type ===
+ "float" ||
+ myData.node.template[templateParam].type ===
+ "code" ||
+ myData.node.template[templateParam].type ===
+ "prompt" ||
+ myData.node.template[templateParam].type ===
+ "file" ||
+ myData.node.template[templateParam].type ===
+ "int")
)
- .map((n, i) => (
-
+ .map((templateParam, index) => (
+
- {myData.node.template[n].name
- ? myData.node.template[n].name
- : myData.node.template[n].display_name}
+ {myData.node.template[templateParam].name
+ ? myData.node.template[templateParam].name
+ : myData.node.template[templateParam]
+ .display_name}
- {myData.node.template[n].type === "str" &&
- !myData.node.template[n].options ? (
+ {myData.node.template[templateParam].type ===
+ "str" &&
+ !myData.node.template[templateParam].options ? (
- {myData.node.template[n].list ? (
+ {myData.node.template[templateParam].list ? (
{
- handleOnNewValue(t, n);
+ onChange={(value: string[]) => {
+ handleOnNewValue(value, templateParam);
}}
/>
- ) : myData.node.template[n].multiline ? (
+ ) : myData.node.template[templateParam]
+ .multiline ? (
{
- handleOnNewValue(t, n);
+ onChange={(value: string) => {
+ handleOnNewValue(value, templateParam);
}}
/>
) : (
@@ -164,112 +180,160 @@ const EditNodeModal = forwardRef(
editNode={true}
disabled={disabled}
password={
- myData.node.template[n].password ??
- false
+ myData.node.template[templateParam]
+ .password ?? false
}
value={
- myData.node.template[n].value ?? ""
+ myData.node.template[templateParam]
+ .value ?? ""
}
- onChange={(t) => {
- handleOnNewValue(t, n);
+ onChange={(value) => {
+ handleOnNewValue(value, templateParam);
}}
/>
)}
- ) : myData.node.template[n].type === "bool" ? (
+ ) : myData.node.template[templateParam].type ===
+ "bool" ? (
{" "}
{
- handleOnNewValue(t, n);
+ enabled={
+ myData.node.template[templateParam].value
+ }
+ setEnabled={(isEnabled) => {
+ handleOnNewValue(
+ isEnabled,
+ templateParam
+ );
}}
size="small"
/>
- ) : myData.node.template[n].type === "float" ? (
+ ) : myData.node.template[templateParam].type ===
+ "float" ? (
{
- handleOnNewValue(t, n);
+ value={
+ myData.node.template[templateParam]
+ .value ?? ""
+ }
+ onChange={(value) => {
+ handleOnNewValue(value, templateParam);
}}
/>
- ) : myData.node.template[n].type === "str" &&
- myData.node.template[n].options ? (
+ ) : myData.node.template[templateParam].type ===
+ "str" &&
+ myData.node.template[templateParam].options ? (
handleOnNewValue(t, n)}
+ options={
+ myData.node.template[templateParam]
+ .options
+ }
+ onSelect={(value) =>
+ handleOnNewValue(value, templateParam)
+ }
value={
- myData.node.template[n].value ??
- "Choose an option"
+ myData.node.template[templateParam]
+ .value ?? "Choose an option"
}
>
- ) : myData.node.template[n].type === "int" ? (
+ ) : myData.node.template[templateParam].type ===
+ "int" ? (
{
- handleOnNewValue(t, n);
+ value={
+ myData.node.template[templateParam]
+ .value ?? ""
+ }
+ onChange={(value) => {
+ handleOnNewValue(value, templateParam);
}}
/>
- ) : myData.node.template[n].type === "file" ? (
+ ) : myData.node.template[templateParam].type ===
+ "file" ? (
{
- handleOnNewValue(t, n);
+ value={
+ myData.node.template[templateParam]
+ .value ?? ""
+ }
+ onChange={(value: string) => {
+ handleOnNewValue(value, templateParam);
}}
fileTypes={
- myData.node.template[n].fileTypes
+ myData.node.template[templateParam]
+ .fileTypes
}
- suffixes={myData.node.template[n].suffixes}
- onFileChange={(t: string) => {
- handleOnNewValue(t, n);
+ suffixes={
+ myData.node.template[templateParam]
+ .suffixes
+ }
+ onFileChange={(filePath: string) => {
+ data.node.template[
+ templateParam
+ ].file_path = filePath;
}}
>
- ) : myData.node.template[n].type === "prompt" ? (
+ ) : myData.node.template[templateParam].type ===
+ "prompt" ? (
{
myData.node = nodeClass;
}}
- value={myData.node.template[n].value ?? ""}
- onChange={(t: string) => {
- handleOnNewValue(t, n);
+ value={
+ myData.node.template[templateParam]
+ .value ?? ""
+ }
+ onChange={(value: string) => {
+ handleOnNewValue(value, templateParam);
}}
/>
- ) : myData.node.template[n].type === "code" ? (
+ ) : myData.node.template[templateParam].type ===
+ "code" ? (
{
+ data.node = nodeClass;
+ }}
+ nodeClass={data.node}
disabled={disabled}
editNode={true}
- value={myData.node.template[n].value ?? ""}
- onChange={(t: string) => {
- handleOnNewValue(t, n);
+ value={
+ myData.node.template[templateParam]
+ .value ?? ""
+ }
+ onChange={(value: string) => {
+ handleOnNewValue(value, templateParam);
}}
/>
- ) : myData.node.template[n].type === "Any" ? (
+ ) : myData.node.template[templateParam].type ===
+ "Any" ? (
"-"
) : (
@@ -278,8 +342,13 @@ const EditNodeModal = forwardRef(
changeAdvanced(n)}
+ enabled={
+ !myData.node.template[templateParam]
+ .advanced
+ }
+ setEnabled={(e) =>
+ changeAdvanced(templateParam)
+ }
disabled={disabled}
size="small"
/>
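Both the API modal and the edit-node modal filter template entries down to user-editable fields: keys not starting with `_`, marked as shown, and of one of a fixed set of scalar-ish types. That repeated check, factored into a hypothetical helper:

```typescript
// Hypothetical shape of a template field; only a subset of types is
// editable in the settings dialogs, and "_"-prefixed keys are internal.
interface TemplateField { show: boolean; type: string; }

const EDITABLE_TYPES = new Set([
  "str", "bool", "float", "code", "prompt", "file", "int",
]);

function isEditable(name: string, field: TemplateField): boolean {
  return name.charAt(0) !== "_" && field.show && EDITABLE_TYPES.has(field.type);
}
```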
diff --git a/src/frontend/src/modals/NodeModal/components/ModalField/index.tsx b/src/frontend/src/modals/NodeModal/components/ModalField/index.tsx
index 15c0984ea..4b80248db 100644
--- a/src/frontend/src/modals/NodeModal/components/ModalField/index.tsx
+++ b/src/frontend/src/modals/NodeModal/components/ModalField/index.tsx
@@ -142,7 +142,7 @@ export default function ModalField({
fileTypes={data.node.template[name].fileTypes}
suffixes={data.node.template[name].suffixes}
onFileChange={(t: string) => {
- data.node.template[name].content = t;
+ data.node.template[name].file_path = t;
}}
>
@@ -160,6 +160,11 @@ export default function ModalField({
) : type === "code" ? (
{
+ data.node = nodeClass;
+ }}
+ nodeClass={data.node}
disabled={false}
value={data.node.template[name].value ?? ""}
onChange={(t: string) => {
diff --git a/src/frontend/src/modals/baseModal/index.tsx b/src/frontend/src/modals/baseModal/index.tsx
index 81bc206da..142ad223b 100644
--- a/src/frontend/src/modals/baseModal/index.tsx
+++ b/src/frontend/src/modals/baseModal/index.tsx
@@ -46,11 +46,13 @@ interface BaseModalProps {
];
open?: boolean;
setOpen?: (open: boolean) => void;
+ disable?: boolean;
size?: "smaller" | "small" | "medium" | "large" | "large-h-full";
}
function BaseModal({
open,
setOpen,
+ disable = false,
children,
size = "large",
}: BaseModalProps) {
@@ -99,7 +101,10 @@ function BaseModal({
//UPDATE COLORS AND STYLE CLASSSES
return (
-
+
{triggerChild}
@@ -107,8 +112,9 @@ function BaseModal({
{ContentChild}
-
- {ContentFooter}
+ {ContentFooter && (
+ {ContentFooter}
+ )}
);
diff --git a/src/frontend/src/modals/codeAreaModal/index.tsx b/src/frontend/src/modals/codeAreaModal/index.tsx
index 39b373107..935ae2964 100644
--- a/src/frontend/src/modals/codeAreaModal/index.tsx
+++ b/src/frontend/src/modals/codeAreaModal/index.tsx
@@ -3,14 +3,16 @@ import "ace-builds/src-noconflict/ext-language_tools";
import "ace-builds/src-noconflict/mode-python";
import "ace-builds/src-noconflict/theme-github";
import "ace-builds/src-noconflict/theme-twilight";
-import { ReactNode, useContext, useState } from "react";
+// import "ace-builds/webpack-resolver";
+import { ReactNode, useContext, useEffect, useState } from "react";
import AceEditor from "react-ace";
import IconComponent from "../../components/genericIconComponent";
import { Button } from "../../components/ui/button";
import { CODE_PROMPT_DIALOG_SUBTITLE } from "../../constants/constants";
import { alertContext } from "../../contexts/alertContext";
import { darkContext } from "../../contexts/darkContext";
-import { postValidateCode } from "../../controllers/API";
+import { typesContext } from "../../contexts/typesContext";
+import { postCustomComponent, postValidateCode } from "../../controllers/API";
import { APIClassType } from "../../types/api";
import BaseModal from "../baseModal";
@@ -20,18 +22,34 @@ export default function CodeAreaModal({
nodeClass,
setNodeClass,
children,
+ dynamic,
}: {
setValue: (value: string) => void;
value: string;
- nodeClass: APIClassType;
+ nodeClass?: APIClassType;
children: ReactNode;
- setNodeClass: (Class: APIClassType) => void;
+ setNodeClass?: (Class: APIClassType) => void;
+ dynamic?: boolean;
}) {
const [code, setCode] = useState(value);
const { dark } = useContext(darkContext);
+ const { reactFlowInstance } = useContext(typesContext);
+ const [height, setHeight] = useState(null);
const { setErrorData, setSuccessData } = useContext(alertContext);
+ const [error, setError] = useState<{
+ detail: { error: string; traceback: string };
+ }>(null);
- function handleClick() {
+ useEffect(() => {
+ // If nodeClass.template has fields other than "code" and dynamic is true,
+ // skip the initial code processing.
+ if (dynamic && Object.keys(nodeClass.template).length > 2) {
+ return;
+ }
+ processCode();
+ }, []);
+
+ function processNonDynamicField() {
postValidateCode(code)
.then((apiReturn) => {
if (apiReturn.data) {
@@ -41,8 +59,9 @@ export default function CodeAreaModal({
setSuccessData({
title: "Code is ready to run",
});
- setValue(code);
setOpen(false);
+ setValue(code);
} else {
if (funcErrors.length !== 0) {
setErrorData({
@@ -70,8 +89,56 @@ export default function CodeAreaModal({
});
}
+ function processDynamicField() {
+ postCustomComponent(code, nodeClass)
+ .then((apiReturn) => {
+ const { data } = apiReturn;
+ if (data) {
+ setNodeClass(data);
+ setValue(code);
+ setError({ detail: { error: undefined, traceback: undefined } });
+ setOpen(false);
+ }
+ })
+ .catch((err) => {
+ setError(err.response.data);
+ });
+ }
+
+ function processCode() {
+ if (!dynamic) {
+ processNonDynamicField();
+ } else {
+ processDynamicField();
+ }
+ }
+
+ function handleClick() {
+ processCode();
+ }
+
+ useEffect(() => {
+ // Function to be executed after the state changes
+ const delayedFunction = setTimeout(() => {
+ if (error?.detail.error !== undefined) {
+ // trigger the height update; no real height value is applied
+ setHeight("90%");
+ }
+ // 600ms, so this runs after the 500ms transition
+ }, 600);
+
+ // Cleanup function to clear the timeout if the component unmounts or the state changes again
+ return () => {
+ clearTimeout(delayedFunction);
+ };
+ }, [error, setHeight]);
+
const [open, setOpen] = useState(false);
+ useEffect(() => {
+ setCode(value);
+ }, [value, open]);
+
return (
{children}
@@ -89,6 +156,7 @@ export default function CodeAreaModal({
{
setCode(value);
}}
- className="h-full w-full rounded-lg border-[1px] border-border custom-scroll"
+ className="h-full w-full rounded-lg border-[1px] border-gray-300 custom-scroll dark:border-gray-600"
/>
+
+
+
+ {error?.detail?.error}
+
+
+
+ {error?.detail?.traceback}
+
+
+
+
Check & Save
diff --git a/src/frontend/src/modals/exportModal/index.tsx b/src/frontend/src/modals/exportModal/index.tsx
index ee6434f67..bc75c2b34 100644
--- a/src/frontend/src/modals/exportModal/index.tsx
+++ b/src/frontend/src/modals/exportModal/index.tsx
@@ -12,9 +12,12 @@ const ExportModal = forwardRef((props: { children: ReactNode }, ref) => {
const { flows, tabId, updateFlow, downloadFlow, saveFlow } =
useContext(TabsContext);
const [checked, setChecked] = useState(false);
- const [name, setName] = useState(flows.find((f) => f.id === tabId).name);
+ const [name, setName] = useState(
+ flows.find((flow) => flow.id === tabId).name
+ );
+ const [invalidName, setInvalidName] = useState(false);
const [description, setDescription] = useState(
- flows.find((f) => f.id === tabId).description
+ flows.find((flow) => flow.id === tabId).description
);
const [open, setOpen] = useState(false);
return (
@@ -30,6 +33,8 @@ const ExportModal = forwardRef((props: { children: ReactNode }, ref) => {
{
onClick={() => {
if (checked)
downloadFlow(
- flows.find((f) => f.id === tabId),
+ flows.find((flow) => flow.id === tabId),
name,
description
);
else
downloadFlow(
- removeApiKeys(flows.find((f) => f.id === tabId)),
+ removeApiKeys(flows.find((flow) => flow.id === tabId)),
name,
description
);
diff --git a/src/frontend/src/modals/flowSettingsModal/index.tsx b/src/frontend/src/modals/flowSettingsModal/index.tsx
index 9dc9d1daa..0005acbc6 100644
--- a/src/frontend/src/modals/flowSettingsModal/index.tsx
+++ b/src/frontend/src/modals/flowSettingsModal/index.tsx
@@ -19,12 +19,16 @@ export default function FlowSettingsModal({
const { flows, tabId, updateFlow, setTabsState, saveFlow } =
useContext(TabsContext);
const maxLength = 50;
- const [name, setName] = useState(flows.find((f) => f.id === tabId).name);
- const [description, setDescription] = useState(
- flows.find((f) => f.id === tabId).description
+ const [name, setName] = useState(
+ flows.find((flow) => flow.id === tabId).name
);
+ const [description, setDescription] = useState(
+ flows.find((flow) => flow.id === tabId).description
+ );
+ const [invalidName, setInvalidName] = useState(false);
+
function handleClick() {
- let savedFlow = flows.find((f) => f.id === tabId);
+ let savedFlow = flows.find((flow) => flow.id === tabId);
savedFlow.name = name;
savedFlow.description = description;
saveFlow(savedFlow);
@@ -39,6 +43,8 @@ export default function FlowSettingsModal({
-
+
Save
diff --git a/src/frontend/src/modals/formModal/chatInput/index.tsx b/src/frontend/src/modals/formModal/chatInput/index.tsx
index e8be2f292..be1e5a45a 100644
--- a/src/frontend/src/modals/formModal/chatInput/index.tsx
+++ b/src/frontend/src/modals/formModal/chatInput/index.tsx
@@ -46,8 +46,8 @@ export default function ChatInput({
}`,
}}
value={lockChat ? "Thinking..." : chatValue}
- onChange={(e) => {
- setChatValue(e.target.value);
+ onChange={(event) => {
+ setChatValue(event.target.value);
}}
className={classNames(
lockChat
diff --git a/src/frontend/src/modals/formModal/chatMessage/index.tsx b/src/frontend/src/modals/formModal/chatMessage/index.tsx
index 08cc2d22b..c132f025a 100644
--- a/src/frontend/src/modals/formModal/chatMessage/index.tsx
+++ b/src/frontend/src/modals/formModal/chatMessage/index.tsx
@@ -88,8 +88,8 @@ export default function ChatMessage({
{props.children}>;
@@ -187,7 +187,7 @@ export default function ChatMessage({
}
/>
-
+
{promptOpen
? template?.split("\n")?.map((line, index) => {
const regex = /{([^}]+)}/g;
diff --git a/src/frontend/src/modals/formModal/index.tsx b/src/frontend/src/modals/formModal/index.tsx
index f0e18a118..68137c0da 100644
--- a/src/frontend/src/modals/formModal/index.tsx
+++ b/src/frontend/src/modals/formModal/index.tsx
@@ -46,7 +46,7 @@ export default function FormModal({
const handleKeys = formKeysData.handle_keys;
const keyToUse = Object.keys(inputKeys).find(
- (k) => !handleKeys.some((j) => j === k) && inputKeys[k] === ""
+ (key) => !handleKeys.some((j) => j === key) && inputKeys[key] === ""
);
return inputKeys[keyToUse];
@@ -67,14 +67,17 @@ export default function FormModal({
const id = useRef(flow.id);
const tabsStateFlowId = tabsState[flow.id];
const tabsStateFlowIdFormKeysData = tabsStateFlowId.formKeysData;
- const [chatKey, setChatKey] = useState(
- Object.keys(tabsState[flow.id].formKeysData.input_keys).find(
- (k) =>
- !tabsState[flow.id].formKeysData.handle_keys.some((j) => j === k) &&
- tabsState[flow.id].formKeysData.input_keys[k] === ""
- )
- );
-
+ const [chatKey, setChatKey] = useState(() => {
+ if (tabsState[flow.id]?.formKeysData?.input_keys) {
+ return Object.keys(tabsState[flow.id].formKeysData.input_keys).find(
+ (key) =>
+ !tabsState[flow.id].formKeysData.handle_keys.some((j) => j === key) &&
+ tabsState[flow.id].formKeysData.input_keys[key] === ""
+ );
+ }
+ // TODO: return a sensible default
+ return "";
+ });
useEffect(() => {
if (messagesRef.current) {
messagesRef.current.scrollTop = messagesRef.current.scrollHeight;
@@ -112,7 +115,6 @@ export default function FormModal({
return newChat;
});
};
-
//add proper type signature for function
function updateLastMessage({
@@ -158,7 +160,7 @@ export default function FormModal({
}
function getWebSocketUrl(chatId, isDevelopment = false) {
- const isSecureProtocol = window.location.protocol === "https:";
+ const isSecureProtocol =
+ window.location.protocol === "https:" || window.location.port === "443";
const webSocketProtocol = isSecureProtocol ? "wss" : "ws";
const host = isDevelopment ? "localhost:7860" : window.location.host;
const chatEndpoint = `/api/v1/chat/${chatId}`;
@@ -371,10 +373,6 @@ export default function FormModal({
if (lockChat) setLockChat(false);
}
- function setModalOpen(x: boolean) {
- setOpen(x);
- }
-
function handleOnCheckedChange(checked: boolean, i: string) {
if (checked === true) {
setChatKey(i);
@@ -424,126 +422,103 @@ export default function FormModal({
- {Object.keys(tabsState[id.current].formKeysData.input_keys).map(
- (i, k) => (
-
+ {tabsState[id.current]?.formKeysData?.input_keys
+ ? Object.keys(
+ tabsState[id.current].formKeysData.input_keys
+ ).map((key, index) => (
+
+
+
+ {key}
+
+
+ {
+ event.stopPropagation();
+ }}
+ >
+
+ handleOnCheckedChange(value, key)
+ }
+ size="small"
+ disabled={tabsState[
+ id.current
+ ].formKeysData.handle_keys.some(
+ (t) => t === key
+ )}
+ />
+
+
+ }
+ key={index}
+ keyValue={key}
+ >
+
+ {tabsState[id.current].formKeysData.handle_keys.some(
+ (t) => t === key
+ ) && (
+
+ Source: Component
+
+ )}
+
+
+
+
+ ))
+ : null}
+ {/* memory_keys is guarded with optional chaining for the same reason as input_keys above */}
+ {tabsState[id.current]?.formKeysData?.memory_keys?.map(
+ (key, index) => (
+
- {i}
+ {key}
-
- {
- event.stopPropagation();
- }}
- >
+
- handleOnCheckedChange(value, i)
- }
+ enabled={chatKey === key}
+ setEnabled={() => {}}
size="small"
- disabled={tabsState[
- id.current
- ].formKeysData.handle_keys.some((t) => t === i)}
+ disabled={true}
/>
}
- key={k}
- keyValue={i}
+ key={index}
+ keyValue={key}
>
- {tabsState[id.current].formKeysData.handle_keys.some(
- (t) => t === i
- ) && (
-
- Source: Component
-
- )}
-
+
+ Source: Memory
+
)
)}
- {tabsState[id.current].formKeysData.memory_keys.map((i, k) => (
-
-
-
- {i}
-
-
- {
- event.stopPropagation();
- }}
- >
-
- handleOnCheckedChange(value, i)
- }
- size="small"
- disabled={tabsState[
- id.current
- ].formKeysData.handle_keys.some((t) => t === i)}
- />
-
-
- }
- key={k}
- keyValue={i}
- >
-
- {tabsState[id.current].formKeysData.handle_keys.some(
- (t) => t === i
- ) && (
-
- Source: Component
-
- )}
-
-
-
-
- ))}
@@ -563,14 +538,14 @@ export default function FormModal({
{chatHistory.length > 0 ? (
- chatHistory.map((c, i) => (
+ chatHistory.map((chat, index) => (
))
) : (
diff --git a/src/frontend/src/modals/genericModal/index.tsx b/src/frontend/src/modals/genericModal/index.tsx
index df9d32f6d..c3d26700f 100644
--- a/src/frontend/src/modals/genericModal/index.tsx
+++ b/src/frontend/src/modals/genericModal/index.tsx
@@ -208,9 +208,9 @@ export default function GenericModal({
setIsEdit(false);
}}
autoFocus
- onChange={(e) => {
- setInputValue(e.target.value);
- checkVariables(e.target.value);
+ onChange={(event) => {
+ setInputValue(event.target.value);
+ checkVariables(event.target.value);
}}
placeholder="Type message here."
/>
@@ -221,8 +221,8 @@ export default function GenericModal({
ref={ref}
className="form-input h-full w-full rounded-lg focus-visible:ring-1"
value={inputValue}
- onChange={(e) => {
- setInputValue(e.target.value);
+ onChange={(event) => {
+ setInputValue(event.target.value);
}}
placeholder="Type message here."
/>
diff --git a/src/frontend/src/modals/importModal/buttonBox/index.tsx b/src/frontend/src/modals/importModal/buttonBox/index.tsx
index 0a998d56e..68759c8be 100644
--- a/src/frontend/src/modals/importModal/buttonBox/index.tsx
+++ b/src/frontend/src/modals/importModal/buttonBox/index.tsx
@@ -96,7 +96,7 @@ export default function ButtonBox({
{title}
- {buttons.map((x, index) => (
-
- {x.Icon}
+ {buttons.map((btn, index) => (
+
+ {btn.Icon}
))}
diff --git a/src/frontend/src/pages/FlowPage/components/PageComponent/index.tsx b/src/frontend/src/pages/FlowPage/components/PageComponent/index.tsx
index ac9aa1d01..a15b99e58 100644
--- a/src/frontend/src/pages/FlowPage/components/PageComponent/index.tsx
+++ b/src/frontend/src/pages/FlowPage/components/PageComponent/index.tsx
@@ -141,10 +141,10 @@ export default function Page({ flow }: { flow: FlowType }) {
}, [setExtraComponent, setExtraNavigation]);
const onEdgesChangeMod = useCallback(
- (s: EdgeChange[]) => {
- onEdgesChange(s);
- setNodes((x) => {
- let newX = _.cloneDeep(x);
+ (change: EdgeChange[]) => {
+ onEdgesChange(change);
+ setNodes((node) => {
+ let newX = _.cloneDeep(node);
return newX;
});
setTabsState((prev) => {
@@ -161,8 +161,8 @@ export default function Page({ flow }: { flow: FlowType }) {
);
const onNodesChangeMod = useCallback(
- (s: NodeChange[]) => {
- onNodesChange(s);
+ (change: NodeChange[]) => {
+ onNodesChange(change);
setTabsState((prev) => {
return {
...prev,
@@ -193,8 +193,8 @@ export default function Page({ flow }: { flow: FlowType }) {
eds
)
);
- setNodes((x) => {
- let newX = _.cloneDeep(x);
+ setNodes((node) => {
+ let newX = _.cloneDeep(node);
return newX;
});
},
@@ -219,7 +219,7 @@ export default function Page({ flow }: { flow: FlowType }) {
const onDragOver = useCallback((event: React.DragEvent) => {
event.preventDefault();
- if (event.dataTransfer.types.some((t) => t === "nodedata")) {
+ if (event.dataTransfer.types.some((types) => types === "nodedata")) {
event.dataTransfer.dropEffect = "move";
} else {
event.dataTransfer.dropEffect = "copy";
@@ -229,7 +229,7 @@ export default function Page({ flow }: { flow: FlowType }) {
const onDrop = useCallback(
(event: React.DragEvent) => {
event.preventDefault();
- if (event.dataTransfer.types.some((t) => t === "nodedata")) {
+ if (event.dataTransfer.types.some((types) => types === "nodedata")) {
takeSnapshot();
// Get the current bounds of the ReactFlow wrapper element
@@ -281,7 +281,7 @@ export default function Page({ flow }: { flow: FlowType }) {
// Add the new node to the list of nodes in state
}
setNodes((nds) => nds.concat(newNode));
- } else if (event.dataTransfer.types.some((t) => t === "Files")) {
+ } else if (event.dataTransfer.types.some((types) => types === "Files")) {
takeSnapshot();
uploadFlow(false, event.dataTransfer.files.item(0));
}
@@ -303,7 +303,10 @@ export default function Page({ flow }: { flow: FlowType }) {
takeSnapshot();
setEdges(
edges.filter(
- (ns) => !mynodes.some((n) => ns.source === n.id || ns.target === n.id)
+ (edge) =>
+ !mynodes.some(
+ (node) => edge.source === node.id || edge.target === node.id
+ )
)
);
},
@@ -326,7 +329,7 @@ export default function Page({ flow }: { flow: FlowType }) {
const onEdgeUpdateEnd = useCallback((_, edge) => {
if (!edgeUpdateSuccessful.current) {
- setEdges((eds) => eds.filter((e) => e.id !== edge.id));
+ setEdges((eds) => eds.filter((edg) => edg.id !== edge.id));
}
edgeUpdateSuccessful.current = true;
diff --git a/src/frontend/src/pages/FlowPage/components/extraSidebarComponent/index.tsx b/src/frontend/src/pages/FlowPage/components/extraSidebarComponent/index.tsx
index 3111f74a2..e6d5a7010 100644
--- a/src/frontend/src/pages/FlowPage/components/extraSidebarComponent/index.tsx
+++ b/src/frontend/src/pages/FlowPage/components/extraSidebarComponent/index.tsx
@@ -1,4 +1,4 @@
-import { useContext, useState } from "react";
+import { useContext, useEffect, useState } from "react";
import ShadTooltip from "../../../../components/ShadTooltipComponent";
import IconComponent from "../../../../components/genericIconComponent";
import { Input } from "../../../../components/ui/input";
@@ -18,8 +18,8 @@ import { classNames } from "../../../../utils/utils";
import DisclosureComponent from "../DisclosureComponent";
export default function ExtraSidebar() {
- const { data } = useContext(typesContext);
- const { flows, tabId, uploadFlow, tabsState, saveFlow } =
+ const { data, templates } = useContext(typesContext);
+ const { flows, tabId, uploadFlow, tabsState, saveFlow, isBuilt } =
useContext(TabsContext);
const { setSuccessData, setErrorData } = useContext(alertContext);
const [dataFilter, setFilterData] = useState(data);
@@ -56,57 +56,84 @@ export default function ExtraSidebar() {
return ret;
});
}
- const flow = flows.find((f) => f.id === tabId);
+ const flow = flows.find((flow) => flow.id === tabId);
+ useEffect(() => {
+ // show components with errors on load
+ let errors = [];
+ Object.keys(templates).forEach((component) => {
+ if (templates[component].error) {
+ errors.push(component);
+ }
+ });
+ if (errors.length > 0)
+ setErrorData({ title: "Components with errors:", list: errors });
+ }, []);
return (
-
- {
- uploadFlow();
- }}
- >
-
-
-
-
-
+
+
+ {
+ uploadFlow();
+ }}
+ >
+
+
+
+
+
-
-
-
-
-
-
- {flow && flow.data && (
-
+
-
+
-
- )}
+
+
+
+
+
+ {flow && flow.data && (
+
+
+
+
+
+ )}
+
-
-
- {
- saveFlow(flow);
- setSuccessData({ title: "Changes saved successfully" });
- }}
- disabled={!isPending}
- >
-
+
+
-
-
+ onClick={(event) => {
+ saveFlow(flow);
+ setSuccessData({ title: "Changes saved successfully" });
+ }}
+ >
+
+
+
+
@@ -116,10 +143,10 @@ export default function ExtraSidebar() {
id="search"
placeholder="Search"
className="nopan nodrag noundo nocopy input-search"
- onChange={(e) => {
- handleSearchInput(e.target.value);
+ onChange={(event) => {
+ handleSearchInput(event.target.value);
// Set search input state
- setSearch(e.target.value);
+ setSearch(event.target.value);
}}
/>
@@ -134,39 +161,43 @@ export default function ExtraSidebar() {
{Object.keys(dataFilter)
.sort()
- .map((d: keyof APIObjectType, i) =>
- Object.keys(dataFilter[d]).length > 0 ? (
+ .map((SBSectionName: keyof APIObjectType, index) =>
+ Object.keys(dataFilter[SBSectionName]).length > 0 ? (
- {Object.keys(dataFilter[d])
+ {Object.keys(dataFilter[SBSectionName])
.sort()
- .map((t: string, k) => (
+ .map((SBItemName: string, index) => (
-
+
onDragStart(event, {
- type: t,
- node: data[d][t],
+ type: SBItemName,
+ node: data[SBSectionName][SBItemName],
})
}
onDragEnd={() => {
@@ -179,7 +210,7 @@ export default function ExtraSidebar() {
>
- {data[d][t].display_name}
+ {data[SBSectionName][SBItemName].display_name}
) : (
-
+
)
)}
diff --git a/src/frontend/src/pages/FlowPage/components/nodeToolbarComponent/index.tsx b/src/frontend/src/pages/FlowPage/components/nodeToolbarComponent/index.tsx
index 06d8ea06d..e036f0360 100644
--- a/src/frontend/src/pages/FlowPage/components/nodeToolbarComponent/index.tsx
+++ b/src/frontend/src/pages/FlowPage/components/nodeToolbarComponent/index.tsx
@@ -9,17 +9,17 @@ import { classNames } from "../../../../utils/utils";
export default function NodeToolbarComponent({ data, setData, deleteNode }) {
const [nodeLength, setNodeLength] = useState(
Object.keys(data.node.template).filter(
- (t) =>
- t.charAt(0) !== "_" &&
- data.node.template[t].show &&
- (data.node.template[t].type === "str" ||
- data.node.template[t].type === "bool" ||
- data.node.template[t].type === "float" ||
- data.node.template[t].type === "code" ||
- data.node.template[t].type === "prompt" ||
- data.node.template[t].type === "file" ||
- data.node.template[t].type === "Any" ||
- data.node.template[t].type === "int")
+ (templateField) =>
+ templateField.charAt(0) !== "_" &&
+ data.node.template[templateField].show &&
+ (data.node.template[templateField].type === "str" ||
+ data.node.template[templateField].type === "bool" ||
+ data.node.template[templateField].type === "float" ||
+ data.node.template[templateField].type === "code" ||
+ data.node.template[templateField].type === "prompt" ||
+ data.node.template[templateField].type === "file" ||
+ data.node.template[templateField].type === "Any" ||
+ data.node.template[templateField].type === "int")
).length
);
diff --git a/src/frontend/src/style/applies.css b/src/frontend/src/style/applies.css
index b45b9eb6f..34f0f6732 100644
--- a/src/frontend/src/style/applies.css
+++ b/src/frontend/src/style/applies.css
@@ -2,976 +2,1027 @@
@tailwind components;
@tailwind utilities;
-
@layer base {
- * {
- @apply border-border;
- }
-
- body {
- @apply bg-background text-foreground;
- font-feature-settings: "rlig" 1, "calt" 1;
- }
+ * {
+ @apply border-border;
+ }
+
+ body {
+ @apply bg-background text-foreground;
+ font-feature-settings: "rlig" 1, "calt" 1;
+ }
}
-
+
@keyframes slideDown {
- from {
- height: 0;
- }
- to {
- height: var(--radix-accordion-content-height);
- }
+ from {
+ height: 0;
+ }
+ to {
+ height: var(--radix-accordion-content-height);
+ }
}
-
+
@keyframes slideUp {
- from {
- height: var(--radix-accordion-content-height);
- }
- to {
- height: 0;
- }
-}
+ from {
+ height: var(--radix-accordion-content-height);
+ }
+ to {
+ height: 0;
+ }
+}
+
+@keyframes gradient-motion-start {
+ 0% {
+ stop-color: rgb(156, 138, 236);
+ }
+ 50% {
+ stop-color: rgb(255, 130, 184);
+ }
+ 80% {
+ stop-color: rgb(255, 165, 100);
+ }
+ 100% {
+ stop-color: rgb(156, 138, 236);
+ }
+}
+
+@keyframes gradient-motion-end {
+ 0% {
+ stop-color: rgb(156, 138, 236);
+ }
+ 50% {
+ stop-color: rgb(255, 165, 100);
+ }
+ 80% {
+ stop-color: rgb(255, 130, 184);
+ }
+ 100% {
+ stop-color: rgb(156, 138, 236);
+ }
+}
@layer components {
- .round-buttons-position {
- @apply fixed right-4
- }
- .side-bar-arrangement {
- @apply flex h-full w-52 flex-col overflow-hidden border-r scrollbar-hide
- }
- .side-bar-search-div-placement {
- @apply relative mx-auto mb-2 mt-2 flex items-center
- }
- .side-bar-components-icon {
- @apply h-6 w-4 text-ring
- }
- .side-bar-components-text {
- @apply w-full truncate pr-1 text-xs text-foreground
- }
- .side-bar-components-div-form {
- @apply flex w-full items-center justify-between rounded-md rounded-l-none border border-l-0 border-dashed border-ring bg-white px-3 py-1 text-sm
- }
- .side-bar-components-border {
- @apply cursor-grab rounded-l-md border-l-8
- }
- .side-bar-components-gap {
- @apply flex flex-col gap-2 p-2
- }
- .side-bar-components-div-arrangement {
- @apply w-full overflow-auto scrollbar-hide
- }
- .search-icon {
- @apply absolute inset-y-0 right-0 flex items-center py-1.5 pr-5
- }
- .extra-side-bar-save-disable {
- @apply text-muted-foreground
- }
- .extra-side-bar-save-disable:hover {
- @apply hover:text-accent-foreground
- }
- .side-bar-button-size {
- @apply h-5 w-5
- }
- .side-bar-button-size:hover {
- @apply hover:text-accent-foreground
- }
- .side-bar-buttons-arrangement {
- @apply mb-2 mt-2 flex w-full items-center justify-between gap-2 px-2
- }
- .extra-side-bar-buttons {
- @apply relative inline-flex w-full items-center justify-center rounded-md bg-background px-2 py-2 text-foreground shadow-sm ring-1 ring-inset ring-input transition-all duration-500 ease-in-out
- }
- .extra-side-bar-buttons:hover {
- @apply hover:bg-muted
- }
- .button-div-style {
- @apply gap-2 flex
- }
- .input-primary{
- @apply disabled:cursor-not-allowed disabled:opacity-50 focus:placeholder-transparent focus:ring-ring focus:border-ring bg-background block text-left border-border form-input px-3 placeholder:text-muted-foreground rounded-md shadow-sm sm:text-sm w-full truncate
- }
-
- /* The same as input-primary but no-truncate */
- .textarea-primary{
- @apply disabled:cursor-not-allowed disabled:opacity-50 focus:placeholder-transparent focus:ring-ring focus:border-ring bg-background block text-left border-border form-input px-3 placeholder:text-muted-foreground rounded-md shadow-sm sm:text-sm w-full
- }
-
- .input-edit-node{
- @apply input-primary border-border pt-0.5 pb-0.5 text-left w-full
- }
- .input-search{
- @apply input-primary pr-7 mx-2
- }
- .input-disable{
- @apply bg-border placeholder:text-ring border-transparent
- }
- .input-dialog{
- @apply text-ring cursor-pointer bg-transparent
- }
- .message-button {
- @apply message-button-position flex h-12 w-12 items-center justify-center rounded-full bg-border px-3 py-1 shadow-md transition-all
- }
-
- .round-button-form {
- @apply flex h-12 w-12 cursor-pointer justify-center rounded-full bg-border px-3 py-1 shadow-md
- }
- .round-button-div {
- @apply flex items-center gap-3
- }
- .build-trigger-loading-icon {
- @apply stroke-build-trigger
- }
- .build-trigger-icon {
- @apply w-6 fill-build-trigger stroke-1 stroke-build-trigger
- }
- .message-button-position {
- @apply fixed bottom-4 right-4
- }
- .message-button-icon {
- @apply fill-chat-trigger stroke-chat-trigger stroke-1
- }
- .disabled-message-button-icon {
- @apply fill-chat-trigger-disabled stroke-chat-trigger-disabled stroke-1
- }
- .components-disclosure-arrangement {
- @apply -mt-px flex w-full select-none items-center justify-between border-y border-y-input bg-muted px-3 py-2
- }
- .components-disclosure-title {
- @apply flex items-center text-sm text-primary
- }
- .components-disclosure-div {
- @apply flex gap-2
- }
- .flow-page-positioning {
- @apply h-full w-full overflow-hidden
- }
- .logspace-page-icon {
- @apply absolute bottom-2 left-7 flex h-6 cursor-pointer flex-col items-center justify-start overflow-hidden rounded-lg bg-foreground px-2 text-center font-sans text-xs tracking-wide text-secondary transition-all duration-500 ease-in-out
- }
-
- .logspace-page-icon:hover {
- @apply hover:h-12
- }
-
- .flex-max-width {
- @apply flex w-full
- }
-
- .main-page-panel {
- @apply flex-max-width h-full flex-col overflow-auto bg-muted px-16
- }
-
- .main-page-nav-arrangement {
- @apply flex-max-width justify-between px-6 py-12 pb-2
- }
-
- .main-page-nav-title {
- @apply flex items-center justify-center gap-2 text-2xl font-semibold
- }
-
- .main-page-nav-button {
- @apply mr-2 w-4
- }
-
- .main-page-description-text {
- @apply flex w-[60%] px-6 pb-14 text-muted-foreground
- }
-
- .main-page-flows-display {
- @apply grid w-full gap-4 p-4 md:grid-cols-2 lg:grid-cols-4
- }
-
- .community-page-arrangement {
- @apply flex-max-width h-full flex-col overflow-auto bg-muted px-16
- }
-
- .community-page-nav-arrangement {
- @apply flex-max-width justify-between px-6 py-12 pb-2
- }
-
- .community-page-nav-title {
- @apply flex items-center justify-center gap-2 text-2xl font-semibold
- }
-
- .community-page-nav-button {
- @apply flex gap-2
- }
-
- .community-page-description-text {
- @apply flex w-[70%] px-6 pb-8 text-muted-foreground
- }
-
- .community-pages-flows-panel {
- @apply grid w-full gap-4 p-4 md:grid-cols-2 lg:grid-cols-4
- }
- .generic-node-div {
- @apply relative flex w-96 flex-col justify-center rounded-lg bg-background
- }
- .generic-node-div-title {
- @apply flex w-full items-center justify-between gap-8 rounded-t-lg border-b bg-muted p-4
- }
- .generic-node-title-arrangement {
- @apply flex-max-width items-center truncate
- }
- .generic-node-icon {
- @apply h-10 w-10 rounded p-1
- }
- .generic-node-tooltip-div {
- @apply ml-2 truncate
- }
- .generic-node-validation-div {
- @apply max-h-96 overflow-auto
- }
-
- .generic-node-status-position {
- @apply relative top-[3px] h-5 w-5
- }
-
- .generic-node-status-animation {
- @apply hidden h-4 w-4 animate-spin rounded-full bg-ring opacity-0
- }
- .generic-node-status {
- @apply h-4 w-4 rounded-full opacity-100
- }
- .green-status {
- @apply generic-node-status bg-status-green
- }
- .red-status {
- @apply generic-node-status bg-status-red
- }
- .yellow-status {
- @apply generic-node-status bg-status-yellow
- }
- .status-build-animation {
- @apply hidden h-4 w-4 animate-spin rounded-full bg-ring opacity-0
- }
- .status-div {
- @apply absolute w-4 duration-200 ease-in-out
- }
- .status-div:hover {
- @apply hover:text-accent-foreground hover:transition-all
- }
- .generic-node-desc {
- @apply h-full w-full py-5 text-foreground
- }
- .generic-node-desc-text {
- @apply w-full px-5 pb-3 text-sm text-muted-foreground
- }
-
- .alert-icon {
- @apply h-5 w-5
- }
- .alert-font-size {
- @apply text-sm font-medium
- }
-
- .error-build-message {
- @apply mt-6 w-96 cursor-pointer rounded-md bg-error-background p-4 shadow-xl
- }
- .error-build-message-circle {
- @apply text-status-red alert-icon
- }
- .error-build-text {
- @apply text-error-foreground
- }
- .error-build-foreground {
- @apply error-build-text alert-font-size
- }
- .error-build-message-div {
- @apply mt-2 text-sm error-build-text
- }
- .error-build-message-list {
- @apply list-disc space-y-1 pl-5
- }
-
- .success-alert {
- @apply mt-6 w-96 rounded-md bg-success-background p-4 shadow-xl
- }
- .success-alert-icon {
- @apply alert-icon text-status-green
- }
- .success-alert-message {
- @apply alert-font-size text-success-foreground
- }
-
- .card-component-title-display {
- @apply round-button-div flex-max-width
- }
- .card-component-image {
- @apply flex h-7 w-7 items-center justify-center rounded-full text-2xl
- }
- .card-component-title-size {
- @apply inline-block w-full flex-1 break-words truncate-doubleline
- }
- .card-component-delete-button {
- @apply flex self-start
- }
- .card-component-delete-icon {
- @apply h-4 w-4 text-primary opacity-0 transition-all group-hover:opacity-100
- }
- .card-component-desc {
- @apply pb-2 pt-2
- }
- .card-component-desc-text {
- @apply truncate-doubleline
- }
- .card-component-footer-arrangement {
- @apply flex-max-width items-end justify-between gap-2
- }
- .card-component-footer {
- @apply flex flex-wrap gap-2
- }
-
- .unused-side-bar-aside {
- @apply flex flex-shrink-0 flex-col overflow-hidden border-r transition-all duration-500
- }
- .unused-side-bar-arrangement {
- @apply flex h-full w-52 flex-col items-start overflow-y-auto border bg-background scrollbar-hide
- }
- .unused-side-bar-division {
- @apply flex-max-width flex-grow flex-col
- }
- .unused-side-bar-nav {
- @apply flex-1 space-y-1
- }
- .unused-side-bar-link {
- @apply flex-max-width items-center rounded-md py-2 pl-2 text-sm font-medium
- }
- .unused-side-bar-link-colors-true {
- @apply bg-muted text-foreground
- }
- .unused-side-bar-link-colors-false {
- @apply bg-background text-muted-foreground hover:bg-muted hover:text-foreground
- }
- .unused-side-bar-icon {
- @apply mr-3 flex-shrink-0 h-6 w-6
- }
- .unused-side-bar-icon-false {
- @apply text-ring group-hover:text-accent-foreground
- }
- .unused-side-bar-disclosure {
- @apply unused-side-bar-link pr-1 text-left
- }
- .unused-side-bar-disclosure:focus {
- @apply focus:outline-none focus:ring-1 focus:ring-ring
- }
- .unused-side-bar-disclosure-icon {
- @apply unused-side-bar-icon text-ring group-hover:text-accent-foreground
- }
- .unused-side-bar-svg-true {
- @apply text-ring rotate-90
- }
- .unused-side-bar-svg {
- @apply ml-3 h-5 w-5 flex-shrink-0 duration-150 ease-in-out group-hover:text-accent-foreground
- }
- .unused-side-bar-disclosure-panel {
- @apply flex w-full items-center rounded-md py-2 pl-11 pr-2 text-sm font-medium
- }
-
- .code-area-component {
- @apply pointer-events-none w-full cursor-not-allowed
- }
- .code-area-input-positioning {
- @apply flex-max-width items-center
- }
- .code-area-external-link {
- @apply w-6 h-6 ml-3
- }
- .code-area-external-link:hover {
- @apply hover:text-accent-foreground
- }
-
- .dropdown-component-outline {
- @apply input-edit-node relative pr-8
- }
- .dropdown-component-false-outline {
- @apply input-primary py-2 pl-3 pr-10 text-left
- }
- .dropdown-component-display {
- @apply block w-full truncate bg-background
- }
- .dropdown-component-arrow {
- @apply pointer-events-none absolute inset-y-0 right-0 flex items-center pr-2
- }
- .dropdown-component-arrow-color {
- @apply h-5 w-5 extra-side-bar-save-disable
- }
- .dropdown-component-options {
- @apply z-10 mt-1 max-h-60 overflow-auto rounded-md bg-background py-1 text-base shadow-lg ring-1 ring-black ring-opacity-5 focus:outline-none sm:text-sm
- }
- .dropdown-component-true-options {
- @apply dropdown-component-options lg:w-[32%]
- }
- .dropdown-component-false-options {
- @apply dropdown-component-options w-full
- }
- .dropdown-component-option {
- @apply relative cursor-default select-none
- }
- .dropdown-component-false-option {
- @apply dropdown-component-option py-0.5 pl-3 pr-12
- }
- .dropdown-component-true-option {
- @apply dropdown-component-option py-2 pl-3 pr-9
- }
- .dropdown-component-choosal {
- @apply absolute inset-y-0 right-0 flex items-center pr-4
- }
- .dropdown-component-check-icon {
- @apply h-5 w-5 text-black
- }
-
- .edit-flow-arrangement {
- @apply flex justify-between
- }
- .edit-flow-span {
- @apply ml-10 animate-pulse text-status-red
- }
-
- .float-component-pointer {
- @apply pointer-events-none cursor-not-allowed
- }
-
- .header-menu-bar {
- @apply flex items-center gap-0.5 rounded-md px-1.5 py-1 text-sm font-medium
- }
- .header-menu-bar-display {
- @apply flex max-w-[200px] items-center gap-2 cursor-pointer
- }
- .header-menu-flow-name {
- @apply flex-1 truncate
- }
- .header-menu-options {
- @apply mr-2 h-4 w-4
- }
-
- .header-arrangement {
- @apply flex-max-width h-12 items-center justify-between border-border bg-muted
- }
- .header-start-display {
- @apply flex w-96 items-center justify-start gap-2
- }
- .header-end-division {
- @apply flex w-96 justify-end px-2
- }
- .header-end-display {
- @apply ml-auto mr-2 flex items-center gap-5
- }
- .header-github-link-box {
- @apply border border-input h-9 px-3 pr-0 rounded-md inline-flex shadow-sm items-center justify-center
- }
- .header-github-link {
- @apply text-sm font-medium disabled:opacity-50 disabled:pointer-events-none ring-offset-background text-muted-foreground header-github-link-box
- }
- .header-github-link:focus-visible {
- @apply focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2
- }
- .header-github-link:hover {
- @apply hover:bg-accent hover:text-accent-foreground
- }
- .header-github-display {
- @apply -mr-px ml-2 flex h-9 items-center justify-center rounded-md rounded-l-none border bg-background px-2 text-sm
- }
- .header-notifications-box {
- @apply fixed left-0 top-0 h-screen w-screen
- }
- .header-notifications {
- @apply absolute right-[3px] h-1.5 w-1.5 rounded-full bg-destructive
- }
-
- .input-component-div {
- @apply pointer-events-none relative cursor-not-allowed
- }
- .input-component-button {
- @apply absolute inset-y-0 right-0 items-center text-muted-foreground
- }
- .input-component-true-button {
- @apply input-component-button pr-2
- }
- .input-component-false-button {
- @apply input-component-button px-4
- }
- .input-component-true-svg {
- @apply absolute bottom-0.5 right-2 side-bar-button-size
- }
- .input-component-false-svg {
- @apply absolute bottom-2 right-3 side-bar-button-size
- }
-
- .input-file-component {
- @apply flex-max-width items-center
- }
-
- .toggle-component-switch {
- @apply relative inline-flex h-6 w-11 flex-shrink-0 cursor-pointer rounded-full border-2 border-transparent transition-colors duration-200 ease-in-out
- }
- .toggle-component-switch:focus {
- @apply focus:outline-none focus:ring-1 focus:ring-primary focus:ring-offset-1
- }
- .toggle-component-span {
- @apply pointer-events-none relative inline-block h-5 w-5 transform rounded-full shadow ring-0 transition duration-200 ease-in-out
- }
- .toggle-component-second-span {
- @apply absolute inset-0 flex h-full w-full items-center justify-center transition-opacity
- }
-
- .app-div {
- @apply fixed bottom-5 left-5 flex flex-col-reverse
- }
-
- .chat-input-modal-txtarea {
- @apply form-input block w-full rounded-md border-ring pr-10 custom-scroll sm:text-sm
- }
- .chat-input-modal-div {
- @apply absolute bottom-0.5 right-3
- }
- .chat-input-modal-lock {
- @apply side-bar-button-size animate-pulse text-ring
- }
- .chat-input-modal-send {
- @apply side-bar-button-size text-ring hover:text-muted-foreground
- }
-
- .code-block-modal {
- @apply flex items-center justify-between px-4 py-1.5
- }
- .code-block-modal-span {
- @apply text-xs lowercase text-muted-foreground
- }
- .code-block-modal-button {
- @apply flex items-center gap-1.5 rounded bg-none p-1 text-xs text-muted-foreground
- }
-
- .chat-message-modal {
- @apply flex-max-width py-2 pl-2
- }
- .chat-message-modal-div {
- @apply my-3 flex h-8 w-8 items-center justify-center overflow-hidden rounded-full
- }
- .chat-message-modal-img {
- @apply absolute scale-150 transition-opacity duration-500
- }
- .chat-message-modal-display {
- @apply flex-max-width items-center text-start
- }
- .chat-message-modal-text {
- @apply relative inline-block w-full text-start text-sm font-normal text-muted-foreground
- }
- .chat-message-modal-icon-div {
- @apply absolute -left-2 -top-1 cursor-pointer
- }
- .chat-message-modal-thought {
- @apply ml-3 inline-block h-full w-[95%] rounded-md border border-ring bg-muted px-2 pb-3 pt-3 text-start text-muted-foreground chat-message-modal-thought-cursor
- }
- .chat-message-modal-thought-cursor {
- @apply cursor-pointer scrollbar-hide overflow-scroll
- }
- .chat-message-modal-markdown {
- @apply w-full px-4 pb-3 pr-8 pt-3
- }
- .chat-message-modal-markdown-span {
- @apply mt-1 animate-pulse cursor-default
- }
- .chat-message-modal-alert {
- @apply inline-block px-3 text-start text-muted-foreground
- }
-
- .file-card-modal-image-div {
- @apply absolute right-0 top-0 rounded-bl-lg bg-muted px-1 text-sm font-bold text-foreground
- }
- .file-card-modal-image-button {
- @apply px-2 py-1 text-ring
- }
- .file-card-modal-button {
- @apply flex w-1/2 items-center justify-between rounded border border-ring bg-muted px-2 py-2 text-foreground shadow hover:drop-shadow-lg
- }
- .file-card-modal-div {
- @apply mr-2 flex-max-width items-center gap-2 text-current
- }
- .file-card-modal-footer {
- @apply flex flex-col items-start
- }
- .file-card-modal-name {
- @apply truncate text-sm text-current
- }
- .file-card-modal-type {
- @apply truncate text-xs text-ring
- }
-
- .send-message-modal-transition {
- @apply fixed inset-0 bg-black bg-opacity-80 backdrop-blur-sm transition-opacity
- }
- .chat-modal-box {
- @apply fixed inset-0 z-10 overflow-y-auto
- }
- .chat-modal-box-div {
- @apply flex h-full items-end justify-center p-4 text-center sm:items-center sm:p-0
- }
- .chat-modal-dialog-panel {
- @apply relative flex h-[95%] w-[690px] transform flex-col justify-between overflow-hidden rounded-lg bg-background text-left shadow-xl drop-shadow-2xl transition-all
- }
- .chat-modal-dialog-panel-div {
- @apply relative w-full p-4
- }
- .chat-modal-dialog-trash-panel {
- @apply absolute right-10 top-2 z-30 text-muted-foreground hover:text-status-red
- }
- .chat-modal-dialog-x-panel {
- @apply absolute right-2 top-1.5 z-30 text-muted-foreground hover:text-status-red
- }
- .chat-modal-dialog-history {
- @apply flex-max-width h-full flex-col items-center overflow-scroll border-t bg-background scrollbar-hide
- }
- .chat-modal-dialog-span-box {
- @apply flex-max-width h-full flex-col items-center justify-center text-center align-middle
- }
- .chat-modal-dialog-desc {
- @apply w-2/4 rounded-md border border-input bg-muted px-6 py-8
- }
- .chat-modal-input-div {
- @apply flex-max-width flex-col items-center justify-between border-t bg-background p-3
- }
- .chat-modal-input {
- @apply relative mt-1 w-full rounded-md shadow-sm
- }
- .code-area-modal-editor-div {
- @apply mt-2 flex-max-width h-full
- }
- .code-area-modal-editor-box {
- @apply h-[300px] w-full rounded-lg border-[1px] border-ring custom-scroll
- }
-
- .edit-node-modal-variable {
- @apply h-5 w-5 stroke-2 pe-1 text-muted-foreground
- }
- .edit-node-modal-span {
- @apply text-sm font-semibold text-primary
- }
- .edit-node-modal-arrangement {
- @apply flex-max-width h-fit max-h-[400px]
- }
- .edit-node-modal-box {
- @apply w-full rounded-lg border-[1px] border-input bg-background
- }
- .edit-node-modal-table {
- @apply flex h-fit flex-col gap-5
- }
- .edit-node-modal-table-header {
- @apply h-10 border-input text-xs font-medium text-ring
- }
- .edit-node-modal-table-cell {
- @apply p-0 text-center text-sm text-foreground truncate sm:px-3
- }
- .edit-node-modal-second-cell {
- @apply w-[300px] p-0 text-center text-xs text-foreground
- }
-
- .generic-modal-txtarea-div {
- @apply mt-2 flex-max-width h-full
- }
-
- .button-box-modal-div {
- @apply flex transform flex-col items-center justify-center rounded-lg border border-ring text-center shadow hover:scale-105 hover:shadow-lg
- }
-
- .dialog-header-modal-div {
- @apply absolute left-0 top-2 z-50 hidden pl-4 pt-4 sm:block
- }
- .dialog-header-modal-button {
- @apply rounded-sm opacity-70 ring-offset-background transition-opacity hover:opacity-100 focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2
- }
-
- .dialog-modal-examples-div {
- @apply h-full w-full overflow-y-auto scrollbar-hide
- }
- .dialog-modal-example-true {
- @apply mx-auto flex flex-row flex-wrap items-start justify-center overflow-auto
- }
- .dialog-modal-example-false {
- @apply flex flex-row items-center justify-center
- }
- .dialog-modal-button-box-div {
- @apply flex-max-width h-full items-center justify-evenly
- }
- .document-icon {
- @apply h-10 w-10 flex-shrink-0
- }
- .loading-component-div {
- @apply flex items-center justify-center align-middle
- }
- .dialog-modal-footer {
- @apply mt-2 flex-max-width items-center justify-center
- }
- .dialog-modal-footer-link {
- @apply flex items-center justify-center text-muted-foreground
- }
-
- .node-modal-div {
- @apply fixed inset-0 bg-ring bg-opacity-75 transition-opacity
- }
- .node-modal-dialog-arrangement {
- @apply fixed inset-0 z-10 overflow-y-auto
- }
- .node-modal-dialog-div {
- @apply flex h-full items-end justify-center p-4 text-center sm:items-center sm:p-0
- }
- .node-modal-dialog-panel {
- @apply relative flex h-[600px] w-[700px] transform flex-col justify-between overflow-hidden rounded-lg bg-background text-left shadow-xl transition-all sm:my-8
- }
- .node-modal-dialog-panel-div {
- @apply absolute right-0 top-0 z-50 hidden pr-4 pt-4 sm:block
- }
- .node-modal-dialog-button {
- @apply rounded-md text-ring hover:text-accent-foreground
- }
- .node-modal-dialog-icon-div {
- @apply flex-max-width h-full flex-col items-center justify-center
- }
- .node-modal-icon-arrangement {
- @apply z-10 flex-max-width justify-center pb-4 shadow-sm
- }
- .node-modal-icon {
- @apply mt-4 h-10 w-10 rounded p-1
- }
- .node-modal-title-div {
- @apply mt-4 text-center sm:ml-4 sm:text-left
- }
- .node-modal-title {
- @apply text-lg font-medium leading-10 text-foreground
- }
- .node-modal-template-div {
- @apply flex-max-width h-full flex-row items-center justify-center gap-4 bg-input p-4
- }
- .node-modal-template {
- @apply w-full rounded-lg bg-background px-4 shadow sm:p-4
- }
- .node-modal-template-column {
- @apply flex h-full flex-col gap-5
- }
- .node-modal-button-box {
- @apply flex-max-width flex-row-reverse bg-input px-4 pb-3
- }
- .node-modal-button {
- @apply inline-flex w-full justify-center rounded-md border border-transparent bg-status-red px-4 py-2 text-base font-medium text-background shadow-sm hover:bg-ring sm:ml-3 sm:w-auto sm:text-sm
- }
- .node-modal-button:focus {
- @apply focus:outline-none focus:ring-1 focus:ring-ring focus:ring-offset-1
- }
-
- .prompt-modal-icon-box {
- @apply mx-auto mt-4 flex h-12 w-12 flex-shrink-0 items-center justify-center rounded-full bg-almost-light-blue sm:mx-0 sm:h-10 sm:w-10
- }
- .prompt-modal-icon {
- @apply h-6 w-6 text-almost-medium-blue
- }
- .prompt-modal-txtarea-arrangement {
- @apply flex-max-width h-full flex-row items-center justify-center gap-4 overflow-auto bg-accent p-4
- }
- .prompt-modal-txtarea-box {
- @apply h-full w-full overflow-hidden rounded-lg bg-background px-4 py-5 shadow sm:p-6
- }
- .prompt-modal-txtarea {
- @apply form-input h-full w-full rounded-lg border-ring
- }
-
- .txtarea-modal-arrangement {
- @apply flex h-full w-full flex-row items-center justify-center gap-4 bg-input p-4
- }
- .txtarea-modal-box {
- @apply w-full overflow-hidden rounded-lg bg-background px-4 py-5 shadow sm:p-6
- }
- .txtarea-modal-input {
- @apply form-input h-full w-full
- }
-
- .api-modal-tabs {
- @apply lg:w-full h-full flex flex-col overflow-hidden text-center bg-muted rounded-md border sm:w-[75vw]
- }
- .api-modal-tablist-div {
- @apply flex items-center justify-between px-2
- }
- .api-modal-tabs-content {
- @apply overflow-hidden w-full h-full px-4 pb-4 -mt-1
- }
- .api-modal-accordion-display {
- @apply flex w-full h-full mt-2
- }
- .api-modal-table-arrangement {
- @apply flex flex-col gap-5 h-fit
- }
-
- .icons-parameters-comp{
- @apply ml-3 h-6 w-6
- }
-
- .form-modal-lock-true {
- @apply bg-input text-primary
- }
- .form-modal-no-input {
- @apply bg-input text-center text-primary dark:bg-gray-700 dark:text-gray-300
- }
- .form-modal-lock-false {
- @apply bg-white text-primary
- }
- .code-highlight{
- @apply block px-3 py-2 w-full max-h-[64vh] text-sm outline-0 border-0 break-all overflow-y-hidden
- }
-
- .code-nohighlight{
- @apply block px-3 py-2 w-full max-h-[70vh] text-sm outline-0 border-0 break-all overflow-y-hidden
- }
- .form-modal-lockchat {
- @apply form-input focus:ring-ring focus:border-ring block w-full rounded-md border-border p-4 pr-16 custom-scroll sm:text-sm
- }
- .form-modal-send-icon-position {
- @apply absolute bottom-2 right-4
- }
- .form-modal-send-button {
- @apply rounded-md p-2 px-1 transition-all duration-300
- }
- .form-modal-lock-icon {
- @apply ml-1 mr-1 h-5 w-5 animate-pulse
- }
- .form-modal-send-icon {
- @apply mr-2 h-5 w-5 rotate-[44deg]
- }
- .form-modal-play-icon {
- @apply h-5 w-5 mx-1
- }
- .form-modal-chat-position {
- @apply flex-max-width px-2 py-6 pl-4 pr-9
- }
- .form-modal-chatbot-icon {
- @apply mb-3 ml-3 mr-6 mt-1
- }
- .form-modal-chat-image {
- @apply flex flex-col items-center gap-1
- }
- .form-modal-chat-img-box {
- @apply relative flex h-8 w-8 items-center justify-center overflow-hidden rounded-md p-5 text-2xl
- }
- .form-modal-chat-bot-icon {
- @apply form-modal-chat-img-box bg-chat-bot-icon
- }
- .form-modal-chat-user-icon {
- @apply form-modal-chat-img-box bg-chat-user-icon
- }
- .form-modal-chat-icon-img {
- @apply absolute scale-[60%]
- }
- .form-modal-chat-text-position {
- @apply flex w-full flex-1 text-start
- }
- .form-modal-chat-text {
- @apply relative flex w-full flex-col text-start text-sm font-normal text-muted-foreground
- }
- .form-modal-chat-icon-div {
- @apply absolute -left-6 -top-3 cursor-pointer
- }
- .form-modal-chat-icon {
- @apply h-4 w-4 animate-bounce
- }
- .form-modal-chat-thought-border {
- @apply rounded-md border border-ring/60
- }
- .form-modal-chat-thought-size {
- @apply inline-block h-full w-[95%]
- }
- .form-modal-chat-thought {
- @apply cursor-pointer overflow-scroll bg-background text-start text-primary scrollbar-hide form-modal-chat-thought-border form-modal-chat-thought-size py-2 px-2
- }
- .form-modal-markdown-span {
- @apply mt-1 animate-pulse cursor-default
- }
- .form-modal-initial-prompt-btn {
- @apply mb-2 flex items-center gap-2 rounded-md border border-border bg-background shadow-sm px-4 py-2 text-sm font-semibold
- }
- .form-modal-iv-box {
- @apply mt-2 flex-max-width h-[80vh]
- }
- .form-modal-iv-size {
- @apply mr-6 flex h-full w-2/6 flex-col justify-start overflow-auto scrollbar-hide
- }
- .file-component-arrangement {
- @apply flex items-center py-2
- }
- .file-component-variable {
- @apply -ml-px mr-1 h-4 w-4 text-primary
- }
- .file-component-variables-span {
- @apply font-semibold text-primary
- }
- .file-component-variables-title {
- @apply flex items-center justify-between pt-2
- }
- .file-component-variables-div {
- @apply mr-2.5 flex items-center
- }
- .file-component-variables-title-txt {
- @apply text-sm font-medium text-primary
- }
- .file-component-accordion-div {
- @apply flex items-start gap-3
- }
- .file-component-badge-div {
- @apply flex-max-width items-center justify-between
- }
- .file-component-tab-column {
- @apply flex flex-col gap-2 p-1
- }
- .tab-accordion-badge-div {
- @apply flex flex-1 items-center justify-between py-4 text-sm font-normal text-muted-foreground transition-all
- }
- .eraser-column-arrangement {
- @apply flex-max-width flex-1 flex-col
- }
- .eraser-size {
- @apply relative flex h-full w-full flex-col rounded-md border bg-muted
- }
- .eraser-position {
- @apply absolute right-3 top-3 z-50
- }
- .chat-message-div {
- @apply flex-max-width h-full flex-col items-center overflow-scroll scrollbar-hide
- }
- .chat-alert-box {
- @apply flex-max-width h-full flex-col items-center justify-center text-center align-middle
- }
- .langflow-chat-span {
- @apply text-lg text-foreground
- }
- .langflow-chat-desc {
- @apply w-2/4 rounded-md border border-border bg-muted px-6 py-8
- }
- .langflow-chat-desc-span {
- @apply text-base text-muted-foreground
- }
- .langflow-chat-input-div {
- @apply flex-max-width flex-col items-center justify-between px-8 pb-6
- }
- .langflow-chat-input {
- @apply relative w-full rounded-md shadow-sm
- }
-
- .tooltip-fixed-width{
- @apply max-w-[30vw] max-h-[20vh] overflow-auto
- }
-
- .ace-editor-arrangement {
- @apply flex-max-width h-full flex-col transition-all
- }
- .ace-editor {
- @apply h-full w-full rounded-lg border-[1px] border-border custom-scroll
- }
- .ace-editor-save-btn {
- @apply flex-max-width h-fit justify-end
- }
-
- .export-modal-save-api {
- @apply font-medium leading-none peer-disabled:cursor-not-allowed peer-disabled:opacity-70
- }
-
- .chat-message-highlight {
- @apply px-0.5 rounded-md bg-indigo-100 dark:bg-indigo-900
- }
- }
\ No newline at end of file
+ .round-buttons-position {
+ @apply fixed right-4;
+ }
+ .side-bar-arrangement {
+ @apply flex h-full w-52 flex-col overflow-hidden border-r scrollbar-hide;
+ }
+ .side-bar-search-div-placement {
+ @apply relative mx-auto mb-2 mt-2 flex items-center;
+ }
+ .side-bar-components-icon {
+ @apply h-6 w-4 text-ring;
+ }
+ .side-bar-components-text {
+ @apply w-full truncate pr-1 text-xs text-foreground;
+ }
+ .side-bar-components-div-form {
+ @apply flex w-full items-center justify-between rounded-md rounded-l-none border border-l-0 border-dashed border-ring bg-white px-3 py-1 text-sm;
+ }
+ .side-bar-components-border {
+ @apply cursor-grab rounded-l-md border-l-8;
+ }
+ .side-bar-components-gap {
+ @apply flex flex-col gap-2 p-2;
+ }
+ .side-bar-components-div-arrangement {
+ @apply w-full overflow-auto scrollbar-hide pb-10;
+ }
+ .search-icon {
+ @apply absolute inset-y-0 right-0 flex items-center py-1.5 pr-5;
+ }
+ .extra-side-bar-save-disable {
+ @apply text-muted-foreground;
+ }
+ .extra-side-bar-save-disable:hover {
+ @apply hover:text-accent-foreground;
+ }
+ .side-bar-button-size {
+ @apply h-5 w-5;
+ }
+ .side-bar-button-size:hover {
+ @apply hover:text-accent-foreground;
+ }
+ .side-bar-buttons-arrangement {
+ @apply mb-2 mt-2 flex w-full items-center justify-between gap-2 px-2;
+ }
+ .side-bar-button {
+ @apply flex w-full;
+ }
+ .button-disable {
+ @apply pointer-events-none;
+ }
+ .extra-side-bar-buttons {
+ @apply relative inline-flex w-full items-center justify-center rounded-md bg-background px-2 py-2 text-foreground shadow-sm ring-1 ring-inset ring-input transition-all duration-500 ease-in-out;
+ }
+ .extra-side-bar-buttons:hover {
+ @apply hover:bg-muted;
+ }
+ .button-div-style {
+ @apply flex gap-2;
+ }
+ .primary-input {
+ @apply form-input block w-full truncate rounded-md border-border bg-background px-3 text-left shadow-sm placeholder:text-muted-foreground focus:border-ring focus:placeholder-transparent focus:ring-ring disabled:cursor-not-allowed disabled:opacity-50 sm:text-sm;
+ }
+
 + /* Same as .primary-input, but without truncate */
+ .textarea-primary {
+ @apply form-input block w-full rounded-md border-border bg-background px-3 text-left shadow-sm placeholder:text-muted-foreground focus:border-ring focus:placeholder-transparent focus:ring-ring disabled:cursor-not-allowed disabled:opacity-50 sm:text-sm;
+ }
+
+ .input-edit-node {
+ @apply primary-input w-full pb-0.5 pt-0.5 text-left;
+ }
+ .input-search {
+ @apply primary-input mx-2 pr-7;
+ }
+ .input-disable {
+ @apply border-transparent bg-border placeholder:text-ring;
+ }
+ .input-dialog {
+ @apply cursor-pointer bg-transparent text-ring;
+ }
+ .message-button {
+ @apply message-button-position flex h-12 w-12 items-center justify-center rounded-full bg-border px-3 py-1 shadow-md transition-all;
+ }
+
+ .round-button-form {
+ @apply flex h-12 w-12 cursor-pointer justify-center rounded-full bg-border px-3 py-1 shadow-md;
+ }
+ .round-button-div {
+ @apply flex items-center gap-3;
+ }
+ .build-trigger-loading-icon {
+ @apply stroke-build-trigger;
+ }
+ .build-trigger-icon {
+ @apply w-6 fill-build-trigger stroke-build-trigger stroke-1;
+ }
+ .message-button-position {
+ @apply fixed bottom-4 right-4;
+ }
+ .message-button-icon {
+ @apply fill-chat-trigger stroke-chat-trigger stroke-1;
+ }
+ .disabled-message-button-icon {
+ @apply fill-chat-trigger-disabled stroke-chat-trigger-disabled stroke-1;
+ }
+ .components-disclosure-arrangement {
+ @apply -mt-px flex w-full select-none items-center justify-between border-y border-y-input bg-muted px-3 py-2;
+ }
+ .components-disclosure-title {
+ @apply flex items-center text-sm text-primary;
+ }
+ .components-disclosure-div {
+ @apply flex gap-2;
+ }
+ .flow-page-positioning {
+ @apply h-full w-full overflow-hidden;
+ }
+ .logspace-page-icon {
+ @apply absolute bottom-2 left-7 flex h-6 cursor-pointer flex-col items-center justify-start overflow-hidden rounded-lg bg-foreground px-2 text-center font-sans text-xs tracking-wide text-secondary transition-all duration-500 ease-in-out;
+ }
+
+ .logspace-page-icon:hover {
+ @apply hover:h-12;
+ }
+
+ .flex-max-width {
+ @apply flex w-full;
+ }
+
+ .main-page-panel {
+ @apply flex-max-width h-full flex-col overflow-auto bg-muted px-16;
+ }
+
+ .main-page-nav-arrangement {
+ @apply flex-max-width justify-between px-6 py-12 pb-2;
+ }
+
+ .main-page-nav-title {
+ @apply flex items-center justify-center gap-2 text-2xl font-semibold;
+ }
+
+ .main-page-nav-button {
+ @apply mr-2 w-4;
+ }
+
+ .main-page-description-text {
+ @apply flex w-[60%] px-6 pb-14 text-muted-foreground;
+ }
+
+ .main-page-flows-display {
+ @apply grid w-full gap-4 p-4 md:grid-cols-2 lg:grid-cols-4;
+ }
+
+ .community-page-arrangement {
+ @apply flex-max-width h-full flex-col overflow-auto bg-muted px-16;
+ }
+
+ .community-page-nav-arrangement {
+ @apply flex-max-width justify-between px-6 py-12 pb-2;
+ }
+
+ .community-page-nav-title {
+ @apply flex items-center justify-center gap-2 text-2xl font-semibold;
+ }
+
+ .community-page-nav-button {
+ @apply flex gap-2;
+ }
+
+ .community-page-description-text {
+ @apply flex w-[70%] px-6 pb-8 text-muted-foreground;
+ }
+
+ .community-pages-flows-panel {
+ @apply grid w-full gap-4 p-4 md:grid-cols-2 lg:grid-cols-4;
+ }
+ .generic-node-div {
+ @apply relative flex w-96 flex-col justify-center rounded-lg bg-background;
+ }
+ .generic-node-div-title {
+ @apply flex w-full items-center justify-between gap-8 rounded-t-lg border-b bg-muted p-4;
+ }
+ .generic-node-title-arrangement {
+ @apply flex-max-width items-center truncate;
+ }
+ .generic-node-icon {
+ @apply h-10 w-10 rounded p-1;
+ }
+ .generic-node-tooltip-div {
+ @apply ml-2 truncate;
+ }
+ .generic-node-validation-div {
+ @apply max-h-96 overflow-auto;
+ }
+
+ .generic-node-status-position {
+ @apply relative top-[3px] h-5 w-5;
+ }
+
+ .generic-node-status-animation {
+ @apply hidden h-4 w-4 animate-spin rounded-full bg-ring opacity-0;
+ }
+ .generic-node-status {
+ @apply h-4 w-4 rounded-full opacity-100;
+ }
+ .green-status {
+ @apply generic-node-status bg-status-green;
+ }
+ .red-status {
+ @apply generic-node-status bg-status-red;
+ }
+ .yellow-status {
+ @apply generic-node-status bg-status-yellow;
+ }
+ .status-build-animation {
+ @apply hidden h-4 w-4 animate-spin rounded-full bg-ring opacity-0;
+ }
+ .status-div {
+ @apply absolute w-4 duration-200 ease-in-out;
+ }
+ .status-div:hover {
+ @apply hover:text-accent-foreground hover:transition-all;
+ }
+ .generic-node-desc {
+ @apply h-full w-full py-5 text-foreground;
+ }
+ .generic-node-desc-text {
+ @apply w-full px-5 pb-3 text-sm text-muted-foreground;
+ }
+
+ .alert-icon {
+ @apply h-5 w-5;
+ }
+ .alert-font-size {
+ @apply text-sm font-medium;
+ }
+
+ .error-build-message {
+ @apply mt-6 w-96 cursor-pointer rounded-md bg-error-background p-4 shadow-xl;
+ }
+ .error-build-message-circle {
+ @apply alert-icon text-status-red;
+ }
+ .error-build-text {
+ @apply text-error-foreground word-break-break-word;
+ }
+ .error-build-foreground {
+ @apply error-build-text alert-font-size;
+ }
+ .error-build-message-div {
+ @apply error-build-text mt-2 text-sm;
+ }
+ .error-build-message-list {
+ @apply list-disc space-y-1 pl-5;
+ }
+
+ .success-alert {
+ @apply mt-6 w-96 rounded-md bg-success-background p-4 shadow-xl;
+ }
+ .success-alert-icon {
+ @apply alert-icon text-status-green;
+ }
+ .success-alert-message {
+ @apply word-break-break-word alert-font-size text-success-foreground;
+ }
+
+ .card-component-title-display {
+ @apply round-button-div flex-max-width;
+ }
+ .card-component-image {
+ @apply flex h-7 w-7 items-center justify-center rounded-full text-2xl;
+ }
+ .card-component-title-size {
+ @apply w-full flex-1 word-break-break-word truncate-doubleline;
+ }
+ .card-component-delete-button {
+ @apply flex self-start;
+ }
+ .card-component-delete-icon {
+ @apply h-4 w-4 text-primary opacity-0 transition-all group-hover:opacity-100;
+ }
+ .card-component-desc {
+ @apply pb-2 pt-2;
+ }
+ .card-component-desc-text {
+ @apply truncate-doubleline;
+ }
+ .card-component-footer-arrangement {
+ @apply flex-max-width items-end justify-between gap-2;
+ }
+ .card-component-footer {
+ @apply flex flex-wrap gap-2;
+ }
+
+ .unused-side-bar-aside {
+ @apply flex flex-shrink-0 flex-col overflow-hidden border-r transition-all duration-500;
+ }
+ .unused-side-bar-arrangement {
+ @apply flex h-full w-52 flex-col items-start overflow-y-auto border bg-background scrollbar-hide;
+ }
+ .unused-side-bar-division {
+ @apply flex-max-width flex-grow flex-col;
+ }
+ .unused-side-bar-nav {
+ @apply flex-1 space-y-1;
+ }
+ .unused-side-bar-link {
+ @apply flex-max-width items-center rounded-md py-2 pl-2 text-sm font-medium;
+ }
+ .unused-side-bar-link-colors-true {
+ @apply bg-muted text-foreground;
+ }
+ .unused-side-bar-link-colors-false {
+ @apply bg-background text-muted-foreground hover:bg-muted hover:text-foreground;
+ }
+ .unused-side-bar-icon {
+ @apply mr-3 h-6 w-6 flex-shrink-0;
+ }
+ .unused-side-bar-icon-false {
+ @apply text-ring group-hover:text-accent-foreground;
+ }
+ .unused-side-bar-disclosure {
+ @apply unused-side-bar-link pr-1 text-left;
+ }
+ .unused-side-bar-disclosure:focus {
+ @apply focus:outline-none focus:ring-1 focus:ring-ring;
+ }
+ .unused-side-bar-disclosure-icon {
+ @apply unused-side-bar-icon text-ring group-hover:text-accent-foreground;
+ }
+ .unused-side-bar-svg-true {
+ @apply rotate-90 text-ring;
+ }
+ .unused-side-bar-svg {
+ @apply ml-3 h-5 w-5 flex-shrink-0 duration-150 ease-in-out group-hover:text-accent-foreground;
+ }
+ .unused-side-bar-disclosure-panel {
+ @apply flex w-full items-center rounded-md py-2 pl-11 pr-2 text-sm font-medium;
+ }
+
+ .code-area-component {
+ @apply pointer-events-none w-full cursor-not-allowed;
+ }
+ .code-area-input-positioning {
+ @apply flex-max-width items-center;
+ }
+ .code-area-external-link {
+ @apply ml-3 h-6 w-6;
+ }
+ .code-area-external-link:hover {
+ @apply hover:text-accent-foreground;
+ }
+
+ .dropdown-component-outline {
+ @apply input-edit-node relative pr-8;
+ }
+ .dropdown-component-false-outline {
+ @apply primary-input py-2 pl-3 pr-10 text-left;
+ }
+ .dropdown-component-display {
+ @apply block w-full truncate bg-background;
+ }
+ .dropdown-component-arrow {
+ @apply pointer-events-none absolute inset-y-0 right-0 flex items-center pr-2;
+ }
+ .dropdown-component-arrow-color {
+ @apply extra-side-bar-save-disable h-5 w-5;
+ }
+ .dropdown-component-options {
+ @apply z-10 mt-1 max-h-60 overflow-auto rounded-md bg-background py-1 text-base shadow-lg ring-1 ring-black ring-opacity-5 focus:outline-none sm:text-sm;
+ }
+ .dropdown-component-true-options {
+ @apply dropdown-component-options lg:w-[32%];
+ }
+ .dropdown-component-false-options {
+ @apply dropdown-component-options w-full;
+ }
+ .dropdown-component-option {
+ @apply relative cursor-default select-none;
+ }
+ .dropdown-component-false-option {
+ @apply dropdown-component-option py-0.5 pl-3 pr-12;
+ }
+ .dropdown-component-true-option {
+ @apply dropdown-component-option py-2 pl-3 pr-9;
+ }
+ .dropdown-component-choosal {
+ @apply absolute inset-y-0 right-0 flex items-center pr-4;
+ }
+ .dropdown-component-check-icon {
+ @apply h-5 w-5 text-black;
+ }
+
+ .edit-flow-arrangement {
+ @apply flex justify-between;
+ }
+ .edit-flow-span {
+ @apply ml-10 animate-pulse text-status-red;
+ }
+
+ .float-component-pointer {
+ @apply pointer-events-none cursor-not-allowed;
+ }
+
+ .header-menu-bar {
+ @apply flex items-center gap-0.5 rounded-md px-1.5 py-1 text-sm font-medium;
+ }
+ .header-menu-bar-display {
+ @apply flex max-w-[200px] cursor-pointer items-center gap-2;
+ }
+ .header-menu-flow-name {
+ @apply flex-1 truncate;
+ }
+ .header-menu-options {
+ @apply mr-2 h-4 w-4;
+ }
+
+ .header-arrangement {
+ @apply flex-max-width h-12 items-center justify-between border-border bg-muted;
+ }
+ .header-start-display {
+ @apply flex w-96 items-center justify-start gap-2;
+ }
+ .header-end-division {
+ @apply flex w-96 justify-end px-2;
+ }
+ .header-end-display {
+ @apply ml-auto mr-2 flex items-center gap-5;
+ }
+ .header-github-link-box {
+ @apply inline-flex h-9 items-center justify-center rounded-md border border-input px-3 pr-0 shadow-sm;
+ }
+ .header-waitlist-link-box {
+ @apply inline-flex h-9 items-center justify-center rounded-md border border-input px-2 shadow-sm text-sm font-medium text-muted-foreground ring-offset-background disabled:pointer-events-none disabled:opacity-50 whitespace-nowrap;
+ }
+ .header-waitlist-link-box:hover {
+ @apply hover:bg-accent hover:text-accent-foreground;
+ }
+ .header-github-link {
+ @apply header-github-link-box text-sm font-medium text-muted-foreground ring-offset-background disabled:pointer-events-none disabled:opacity-50;
+ }
+ .header-github-link:focus-visible {
+ @apply focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2;
+ }
+ .header-github-link:hover {
+ @apply hover:bg-accent hover:text-accent-foreground;
+ }
+ .header-github-display {
+ @apply -mr-px ml-2 flex h-9 items-center justify-center rounded-md rounded-l-none border bg-background px-2 text-sm;
+ }
+ .header-notifications-box {
+ @apply fixed left-0 top-0 h-screen w-screen;
+ }
+ .header-notifications {
+ @apply absolute right-[3px] h-1.5 w-1.5 rounded-full bg-destructive;
+ }
+
+ .input-component-div {
+ @apply pointer-events-none relative cursor-not-allowed;
+ }
+ .input-component-button {
+ @apply absolute inset-y-0 right-0 items-center text-muted-foreground;
+ }
+ .input-component-true-button {
+ @apply input-component-button pr-2;
+ }
+ .input-component-false-button {
+ @apply input-component-button px-4;
+ }
+ .input-component-true-svg {
+ @apply side-bar-button-size absolute bottom-0.5 right-2;
+ }
+ .input-component-false-svg {
+ @apply side-bar-button-size absolute bottom-2 right-3;
+ }
+
+ .input-file-component {
+ @apply flex-max-width items-center;
+ }
+
+ .toggle-component-switch {
+ @apply relative inline-flex h-6 w-11 flex-shrink-0 cursor-pointer rounded-full border-2 border-transparent transition-colors duration-200 ease-in-out;
+ }
+ .toggle-component-switch:focus {
+ @apply focus:outline-none focus:ring-1 focus:ring-primary focus:ring-offset-1;
+ }
+ .toggle-component-span {
+ @apply pointer-events-none relative h-5 w-5 transform rounded-full shadow ring-0 transition duration-200 ease-in-out;
+ }
+ .toggle-component-second-span {
+ @apply absolute inset-0 flex h-full w-full items-center justify-center transition-opacity;
+ }
+
+ .app-div {
+ @apply fixed bottom-5 left-5 flex flex-col-reverse;
+ }
+
+ .chat-input-modal-txtarea {
+ @apply form-input block w-full rounded-md border-ring pr-10 custom-scroll sm:text-sm;
+ }
+ .chat-input-modal-div {
+ @apply absolute bottom-0.5 right-3;
+ }
+ .chat-input-modal-lock {
+ @apply side-bar-button-size animate-pulse text-ring;
+ }
+ .chat-input-modal-send {
+ @apply side-bar-button-size text-ring hover:text-muted-foreground;
+ }
+
+ .code-block-modal {
+ @apply flex items-center justify-between px-4 py-1.5;
+ }
+ .code-block-modal-span {
+ @apply text-xs lowercase text-muted-foreground;
+ }
+ .code-block-modal-button {
+ @apply flex items-center gap-1.5 rounded bg-none p-1 text-xs text-muted-foreground;
+ }
+
+ .chat-message-modal {
+ @apply flex-max-width py-2 pl-2;
+ }
+ .chat-message-modal-div {
+ @apply my-3 flex h-8 w-8 items-center justify-center overflow-hidden rounded-full;
+ }
+ .chat-message-modal-img {
+ @apply absolute scale-150 transition-opacity duration-500;
+ }
+ .chat-message-modal-display {
+ @apply flex-max-width items-center text-start;
+ }
+ .chat-message-modal-text {
+ @apply relative w-full text-start text-sm font-normal text-muted-foreground;
+ }
+ .chat-message-modal-icon-div {
+ @apply absolute -left-2 -top-1 cursor-pointer;
+ }
+ .chat-message-modal-thought {
+ @apply chat-message-modal-thought-cursor ml-3 h-full w-[95%] rounded-md border border-ring bg-muted px-2 pb-3 pt-3 text-start text-muted-foreground;
+ }
+ .chat-message-modal-thought-cursor {
+ @apply cursor-pointer overflow-scroll scrollbar-hide;
+ }
+ .chat-message-modal-markdown {
+ @apply w-full px-4 pb-3 pr-8 pt-3;
+ }
+ .chat-message-modal-markdown-span {
+ @apply mt-1 animate-pulse cursor-default;
+ }
+ .chat-message-modal-alert {
+ @apply px-3 text-start text-muted-foreground;
+ }
+
+ .file-card-modal-image-div {
+ @apply absolute right-0 top-0 rounded-bl-lg bg-muted px-1 text-sm font-bold text-foreground;
+ }
+ .file-card-modal-image-button {
+ @apply px-2 py-1 text-ring;
+ }
+ .file-card-modal-button {
+ @apply flex w-1/2 items-center justify-between rounded border border-ring bg-muted px-2 py-2 text-foreground shadow hover:drop-shadow-lg;
+ }
+ .file-card-modal-div {
+ @apply flex-max-width mr-2 items-center gap-2 text-current;
+ }
+ .file-card-modal-footer {
+ @apply flex flex-col items-start;
+ }
+ .file-card-modal-name {
+ @apply truncate text-sm text-current;
+ }
+ .file-card-modal-type {
+ @apply truncate text-xs text-ring;
+ }
+
+ .send-message-modal-transition {
+ @apply fixed inset-0 bg-black bg-opacity-80 backdrop-blur-sm transition-opacity;
+ }
+ .chat-modal-box {
+ @apply fixed inset-0 z-10 overflow-y-auto;
+ }
+ .chat-modal-box-div {
+ @apply flex h-full items-end justify-center p-4 text-center sm:items-center sm:p-0;
+ }
+ .chat-modal-dialog-panel {
+ @apply relative flex h-[95%] w-[690px] transform flex-col justify-between overflow-hidden rounded-lg bg-background text-left shadow-xl drop-shadow-2xl transition-all;
+ }
+ .chat-modal-dialog-panel-div {
+ @apply relative w-full p-4;
+ }
+ .chat-modal-dialog-trash-panel {
+ @apply absolute right-10 top-2 z-30 text-muted-foreground hover:text-status-red;
+ }
+ .chat-modal-dialog-x-panel {
+ @apply absolute right-2 top-1.5 z-30 text-muted-foreground hover:text-status-red;
+ }
+ .chat-modal-dialog-history {
+ @apply flex-max-width h-full flex-col items-center overflow-scroll border-t bg-background scrollbar-hide;
+ }
+ .chat-modal-dialog-span-box {
+ @apply flex-max-width h-full flex-col items-center justify-center text-center align-middle;
+ }
+ .chat-modal-dialog-desc {
+ @apply w-2/4 rounded-md border border-input bg-muted px-6 py-8;
+ }
+ .chat-modal-input-div {
+ @apply flex-max-width flex-col items-center justify-between border-t bg-background p-3;
+ }
+ .chat-modal-input {
+ @apply relative mt-1 w-full rounded-md shadow-sm;
+ }
+ .code-area-modal-editor-div {
+ @apply flex-max-width mt-2 h-full;
+ }
+ .code-area-modal-editor-box {
+ @apply h-[300px] w-full rounded-lg border-[1px] border-ring custom-scroll;
+ }
+
+ .edit-node-modal-variable {
+ @apply h-5 w-5 stroke-2 pe-1 text-muted-foreground;
+ }
+ .edit-node-modal-span {
+ @apply text-sm font-semibold text-primary;
+ }
+ .edit-node-modal-arrangement {
+ @apply flex-max-width h-fit max-h-[400px];
+ }
+ .edit-node-modal-box {
+ @apply w-full rounded-lg border-[1px] border-input bg-background;
+ }
+ .edit-node-modal-table {
+ @apply flex h-fit flex-col gap-5;
+ }
+ .edit-node-modal-table-header {
+ @apply h-10 border-input text-xs font-medium text-ring;
+ }
+ .edit-node-modal-table-cell {
+ @apply truncate p-0 text-center text-sm text-foreground sm:px-3;
+ }
+ .edit-node-modal-second-cell {
+ @apply w-[300px] p-0 text-center text-xs text-foreground;
+ }
+
+ .generic-modal-txtarea-div {
+ @apply flex-max-width mt-2 h-full;
+ }
+
+ .button-box-modal-div {
+ @apply flex transform flex-col items-center justify-center rounded-lg border border-ring text-center shadow hover:scale-105 hover:shadow-lg;
+ }
+
+ .dialog-header-modal-div {
+ @apply absolute left-0 top-2 z-50 hidden pl-4 pt-4 sm:block;
+ }
+ .dialog-header-modal-button {
+ @apply rounded-sm opacity-70 ring-offset-background transition-opacity hover:opacity-100 focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2;
+ }
+
+ .dialog-modal-examples-div {
+ @apply h-full w-full overflow-y-auto scrollbar-hide;
+ }
+ .dialog-modal-example-true {
+ @apply mx-auto flex flex-row flex-wrap items-start justify-center overflow-auto;
+ }
+ .dialog-modal-example-false {
+ @apply flex flex-row items-center justify-center;
+ }
+ .dialog-modal-button-box-div {
+ @apply flex-max-width h-full items-center justify-evenly;
+ }
+ .document-icon {
+ @apply h-10 w-10 flex-shrink-0;
+ }
+ .loading-component-div {
+ @apply flex items-center justify-center align-middle;
+ }
+ .dialog-modal-footer {
+ @apply flex-max-width mt-2 items-center justify-center;
+ }
+ .dialog-modal-footer-link {
+ @apply flex items-center justify-center text-muted-foreground;
+ }
+
+ .node-modal-div {
+ @apply fixed inset-0 bg-ring bg-opacity-75 transition-opacity;
+ }
+ .node-modal-dialog-arrangement {
+ @apply fixed inset-0 z-10 overflow-y-auto;
+ }
+ .node-modal-dialog-div {
+ @apply flex h-full items-end justify-center p-4 text-center sm:items-center sm:p-0;
+ }
+ .node-modal-dialog-panel {
+ @apply relative flex h-[600px] w-[700px] transform flex-col justify-between overflow-hidden rounded-lg bg-background text-left shadow-xl transition-all sm:my-8;
+ }
+ .node-modal-dialog-panel-div {
+ @apply absolute right-0 top-0 z-50 hidden pr-4 pt-4 sm:block;
+ }
+ .node-modal-dialog-button {
+ @apply rounded-md text-ring hover:text-accent-foreground;
+ }
+ .node-modal-dialog-icon-div {
+ @apply flex-max-width h-full flex-col items-center justify-center;
+ }
+ .node-modal-icon-arrangement {
+ @apply flex-max-width z-10 justify-center pb-4 shadow-sm;
+ }
+ .node-modal-icon {
+ @apply mt-4 h-10 w-10 rounded p-1;
+ }
+ .node-modal-title-div {
+ @apply mt-4 text-center sm:ml-4 sm:text-left;
+ }
+ .node-modal-title {
+ @apply text-lg font-medium leading-10 text-foreground;
+ }
+ .node-modal-template-div {
+ @apply flex-max-width h-full flex-row items-center justify-center gap-4 bg-input p-4;
+ }
+ .node-modal-template {
+ @apply w-full rounded-lg bg-background px-4 shadow sm:p-4;
+ }
+ .node-modal-template-column {
+ @apply flex h-full flex-col gap-5;
+ }
+ .node-modal-button-box {
+ @apply flex-max-width flex-row-reverse bg-input px-4 pb-3;
+ }
+ .link-color {
+ @apply font-semibold text-foreground;
+ }
+ .node-modal-button {
+ @apply inline-flex w-full justify-center rounded-md border border-transparent bg-status-red px-4 py-2 text-base font-medium text-background shadow-sm hover:bg-ring sm:ml-3 sm:w-auto sm:text-sm;
+ }
+ .node-modal-button:focus {
+ @apply focus:outline-none focus:ring-1 focus:ring-ring focus:ring-offset-1;
+ }
+
+ .prompt-modal-icon-box {
+ @apply mx-auto mt-4 flex h-12 w-12 flex-shrink-0 items-center justify-center rounded-full bg-almost-light-blue sm:mx-0 sm:h-10 sm:w-10;
+ }
+ .prompt-modal-icon {
+ @apply h-6 w-6 text-almost-medium-blue;
+ }
+ .prompt-modal-txtarea-arrangement {
+ @apply flex-max-width h-full flex-row items-center justify-center gap-4 overflow-auto bg-accent p-4;
+ }
+ .prompt-modal-txtarea-box {
+ @apply h-full w-full overflow-hidden rounded-lg bg-background px-4 py-5 shadow sm:p-6;
+ }
+ .prompt-modal-txtarea {
+ @apply form-input h-full w-full rounded-lg border-ring;
+ }
+
+ .txtarea-modal-arrangement {
+ @apply flex h-full w-full flex-row items-center justify-center gap-4 bg-input p-4;
+ }
+ .txtarea-modal-box {
+ @apply w-full overflow-hidden rounded-lg bg-background px-4 py-5 shadow sm:p-6;
+ }
+ .txtarea-modal-input {
+ @apply form-input h-full w-full;
+ }
+
+ .api-modal-tabs {
+ @apply flex h-full flex-col overflow-hidden rounded-md border bg-muted text-center sm:w-[75vw] lg:w-full;
+ }
+ .api-modal-tablist-div {
+ @apply flex items-center justify-between px-2;
+ }
+ .api-modal-tabs-content {
+ @apply -mt-1 h-full w-full overflow-hidden px-4 pb-4;
+ }
+ .api-modal-accordion-display {
+ @apply mt-2 flex h-full w-full;
+ }
+ .api-modal-table-arrangement {
+ @apply flex h-fit flex-col gap-5;
+ }
+
+ .icons-parameters-comp {
+ @apply ml-3 h-6 w-6;
+ }
+
+ .form-modal-lock-true {
+ @apply bg-input text-primary;
+ }
+ .form-modal-no-input {
+ @apply bg-input text-center text-primary dark:bg-gray-700 dark:text-gray-300;
+ }
+ .form-modal-lock-false {
+ @apply bg-white text-primary;
+ }
+ .code-highlight {
+ @apply block max-h-[64vh] w-full overflow-y-hidden word-break-break-word border-0 px-3 py-2 text-sm outline-0;
+ }
+
+ .code-nohighlight {
+ @apply block max-h-[70vh] w-full overflow-y-hidden word-break-break-word border-0 px-3 py-2 text-sm outline-0;
+ }
+ .form-modal-lockchat {
+ @apply form-input block w-full rounded-md border-border p-4 pr-16 custom-scroll focus:border-ring focus:ring-ring sm:text-sm;
+ }
+ .form-modal-send-icon-position {
+ @apply absolute bottom-2 right-4;
+ }
+ .form-modal-send-button {
+ @apply rounded-md p-2 px-1 transition-all duration-300;
+ }
+ .form-modal-lock-icon {
+ @apply ml-1 mr-1 h-5 w-5 animate-pulse;
+ }
+ .form-modal-send-icon {
+ @apply mr-2 h-5 w-5 rotate-[44deg];
+ }
+ .form-modal-play-icon {
+ @apply mx-1 h-5 w-5;
+ }
+ .form-modal-chat-position {
+ @apply flex-max-width px-2 py-6 pl-4 pr-9;
+ }
+ .form-modal-chatbot-icon {
+ @apply mb-3 ml-3 mr-6 mt-1;
+ }
+ .form-modal-chat-image {
+ @apply flex flex-col items-center gap-1;
+ }
+ .form-modal-chat-img-box {
+ @apply relative flex h-8 w-8 items-center justify-center overflow-hidden rounded-md p-5 text-2xl;
+ }
+ .form-modal-chat-bot-icon {
+ @apply form-modal-chat-img-box bg-chat-bot-icon;
+ }
+ .form-modal-chat-user-icon {
+ @apply form-modal-chat-img-box bg-chat-user-icon;
+ }
+ .form-modal-chat-icon-img {
+ @apply absolute scale-[60%];
+ }
+ .form-modal-chat-text-position {
+ @apply flex w-full flex-1 text-start;
+ }
+ .form-modal-chat-text {
+ @apply relative flex w-full flex-col text-start text-sm font-normal text-muted-foreground;
+ }
+ .form-modal-chat-icon-div {
+ @apply absolute -left-6 -top-3 cursor-pointer;
+ }
+ .form-modal-chat-icon {
+ @apply h-4 w-4 animate-bounce;
+ }
+ .form-modal-chat-thought-border {
+ @apply rounded-md border border-ring/60;
+ }
+ .form-modal-chat-thought-size {
+ @apply h-full w-[95%];
+ }
+ .form-modal-chat-thought {
+ @apply form-modal-chat-thought-border form-modal-chat-thought-size cursor-pointer overflow-scroll bg-background px-2 py-2 text-start text-primary scrollbar-hide;
+ }
+ .form-modal-markdown-span {
+ @apply mt-1 animate-pulse cursor-default;
+ }
+ .form-modal-initial-prompt-btn {
+ @apply mb-2 flex items-center gap-2 rounded-md border border-border bg-background px-4 py-2 text-sm font-semibold shadow-sm;
+ }
+ .form-modal-iv-box {
+ @apply flex-max-width mt-2 h-[80vh];
+ }
+ .form-modal-iv-size {
+ @apply mr-6 flex h-full w-2/6 flex-col justify-start overflow-auto scrollbar-hide;
+ }
+ .file-component-arrangement {
+ @apply flex items-center py-2;
+ }
+ .file-component-variable {
+ @apply -ml-px mr-1 h-4 w-4 text-primary;
+ }
+ .file-component-variables-span {
+ @apply font-semibold text-primary;
+ }
+ .file-component-variables-title {
+ @apply flex items-center justify-between pt-2;
+ }
+ .file-component-variables-div {
+ @apply mr-2.5 flex items-center;
+ }
+ .file-component-variables-title-txt {
+ @apply text-sm font-medium text-primary;
+ }
+ .file-component-accordion-div {
+ @apply flex items-start gap-3;
+ }
+ .file-component-badge-div {
+ @apply flex-max-width items-center justify-between;
+ }
+ .file-component-tab-column {
+ @apply flex flex-col gap-2 p-1;
+ }
+ .tab-accordion-badge-div {
+ @apply flex flex-1 items-center justify-between py-4 text-sm font-normal text-muted-foreground transition-all;
+ }
+ .eraser-column-arrangement {
+ @apply flex-max-width flex-1 flex-col;
+ }
+ .eraser-size {
+ @apply relative flex h-full w-full flex-col rounded-md border bg-muted;
+ }
+ .eraser-position {
+ @apply absolute right-3 top-3 z-50;
+ }
+ .chat-message-div {
+ @apply flex-max-width h-full flex-col items-center overflow-scroll scrollbar-hide;
+ }
+ .chat-alert-box {
+ @apply flex-max-width h-full flex-col items-center justify-center text-center align-middle;
+ }
+ .langflow-chat-span {
+ @apply text-lg text-foreground;
+ }
+ .langflow-chat-desc {
+ @apply w-2/4 rounded-md border border-border bg-muted px-6 py-8;
+ }
+ .langflow-chat-desc-span {
+ @apply text-base text-muted-foreground;
+ }
+ .langflow-chat-input-div {
+ @apply flex-max-width flex-col items-center justify-between px-8 pb-6;
+ }
+ .langflow-chat-input {
+ @apply relative w-full rounded-md shadow-sm;
+ }
+
+ .tooltip-fixed-width {
+ @apply max-h-[25vh] max-w-[30vw] overflow-auto;
+ }
+
+ .ace-editor-arrangement {
+ @apply flex-max-width h-full flex-col transition-all;
+ }
+ .ace-editor {
+ @apply h-full w-full rounded-lg border-[1px] border-border custom-scroll;
+ }
+ .ace-editor-save-btn {
+ @apply flex-max-width h-fit justify-end;
+ }
+
+ .export-modal-save-api {
+ @apply font-medium leading-none peer-disabled:cursor-not-allowed peer-disabled:opacity-70;
+ }
+
+ .beta-badge-wrapper {
+ @apply absolute right-0 top-0 h-16 w-16 overflow-hidden rounded-tr-lg;
+ }
+ .beta-badge-content {
+ @apply mt-2 w-24 rotate-45 bg-beta-background text-center text-xs font-semibold text-beta-foreground;
+ }
+
+ .chat-message-highlight {
+ @apply rounded-md bg-indigo-100 px-0.5 dark:bg-indigo-900;
+ }
+}
diff --git a/src/frontend/src/style/classes.css b/src/frontend/src/style/classes.css
index 0f9ba3967..b6662e7bf 100644
--- a/src/frontend/src/style/classes.css
+++ b/src/frontend/src/style/classes.css
@@ -1,30 +1,38 @@
body {
- margin: 0;
- font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", "Roboto", "Oxygen",
- "Ubuntu", "Cantarell", "Fira Sans", "Droid Sans", "Helvetica Neue",
- sans-serif;
- -webkit-font-smoothing: antialiased;
- -moz-osx-font-smoothing: grayscale;
+ margin: 0;
+ font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", "Roboto", "Oxygen",
+ "Ubuntu", "Cantarell", "Fira Sans", "Droid Sans", "Helvetica Neue",
+ sans-serif;
+ -webkit-font-smoothing: antialiased;
+ -moz-osx-font-smoothing: grayscale;
}
-
+
code {
- font-family: source-code-pro, Menlo, Monaco, Consolas, "Courier New",
- monospace;
+ font-family: source-code-pro, Menlo, Monaco, Consolas, "Courier New",
+ monospace;
}
pre {
- font-family: inherit;
+ font-family: inherit;
}
.react-flow__pane {
- cursor: default;
+ cursor: default;
}
-
+
.AccordionContent {
- overflow: hidden;
+ overflow: hidden;
}
-.AccordionContent[data-state='open'] {
- animation: slideDown 300ms ease-out;
+.AccordionContent[data-state="open"] {
+ animation: slideDown 300ms ease-out;
+}
+.AccordionContent[data-state="closed"] {
+ animation: slideUp 300ms ease-out;
+}
+
+
+.gradient-end {
+ animation: gradient-motion-end 3s infinite forwards;
+}
+.gradient-start {
+ animation: gradient-motion-start 4s infinite forwards;
}
-.AccordionContent[data-state='closed'] {
- animation: slideUp 300ms ease-out;
-}
\ No newline at end of file
diff --git a/src/frontend/src/style/index.css b/src/frontend/src/style/index.css
index 7e40571b3..e57a89eae 100644
--- a/src/frontend/src/style/index.css
+++ b/src/frontend/src/style/index.css
@@ -2,125 +2,127 @@
@tailwind components;
@tailwind utilities;
-
/* TODO: Confirm that all colors here are found in tailwind config */
@layer base {
-
:root {
- --background: 0 0% 100%; /* hsl(0 0% 100%) */
- --foreground: 222.2 47.4% 11.2%; /* hsl(222 47% 11%) */
- --muted: 210 40% 98%; /* hsl(210 40% 98%) */
- --muted-foreground: 215.4 16.3% 46.9%; /* hsl(215 16% 46%) */
- --popover: 0 0% 100%; /* hsl(0 0% 100%) */
- --popover-foreground: 222.2 47.4% 11.2%; /* hsl(222 47% 11%) */
- --card: 0 0% 100%; /* hsl(0 0% 100%) */
- --card-foreground: 222.2 47.4% 11.2%; /* hsl(222 47% 11%) */
- --border: 214.3 21.8% 91.4%; /* hsl(214 32% 91%) */
- --input: 214.3 21.8% 91.4%; /* hsl(214 32% 91%) */
- --primary: 222.2 27% 11.2%; /* hsl(222 27% 18%) */
- --primary-foreground: 210 40% 98%; /* hsl(210 40% 98%) */
- --secondary: 210 40% 96.1%; /* hsl(210 40% 96%) */
- --secondary-foreground: 222.2 47.4% 11.2%; /* hsl(222 47% 11%) */
- --accent: 210 30% 96.1%; /* hsl(210 30% 96%) */
- --accent-foreground: 222.2 47.4% 11.2%; /* hsl(222 47% 11%) */
- --destructive: 0 100% 50%; /* hsl(0 100% 50%) */
- --destructive-foreground: 210 40% 98%; /* hsl(210 40% 98%) */
- --radius: 0.5rem;
- --ring: 215 20.2% 65.1%; /* hsl(215 20% 65%) */
- --round-btn-shadow: #00000063;
-
- --error-background: #fef2f2;
- --error-foreground: #991b1b;
-
- --success-background: #f0fdf4;
- --success-foreground: #14532d;
+ --background: 0 0% 100%; /* hsl(0 0% 100%) */
+ --foreground: 222.2 47.4% 11.2%; /* hsl(222 47% 11%) */
+ --muted: 210 40% 98%; /* hsl(210 40% 98%) */
+ --muted-foreground: 215.4 16.3% 46.9%; /* hsl(215 16% 46%) */
+ --popover: 0 0% 100%; /* hsl(0 0% 100%) */
+ --popover-foreground: 222.2 47.4% 11.2%; /* hsl(222 47% 11%) */
+ --card: 0 0% 100%; /* hsl(0 0% 100%) */
+ --card-foreground: 222.2 47.4% 11.2%; /* hsl(222 47% 11%) */
+ --border: 214.3 21.8% 91.4%; /* hsl(214 32% 91%) */
+ --input: 214.3 21.8% 91.4%; /* hsl(214 32% 91%) */
+ --primary: 222.2 27% 11.2%; /* hsl(222 27% 18%) */
+ --primary-foreground: 210 40% 98%; /* hsl(210 40% 98%) */
+ --secondary: 210 40% 96.1%; /* hsl(210 40% 96%) */
+ --secondary-foreground: 222.2 47.4% 11.2%; /* hsl(222 47% 11%) */
+ --accent: 210 30% 96.1%; /* hsl(210 30% 96%) */
+ --accent-foreground: 222.2 47.4% 11.2%; /* hsl(222 47% 11%) */
+ --destructive: 0 100% 50%; /* hsl(0 100% 50%) */
+ --destructive-foreground: 210 40% 98%; /* hsl(210 40% 98%) */
+ --radius: 0.5rem;
+ --ring: 215 20.2% 65.1%; /* hsl(215 20% 65%) */
+ --round-btn-shadow: #00000063;
- --info-background: #f0f4fd;
- --info-foreground: #141653;
+ --error-background: #fef2f2;
+ --error-foreground: #991b1b;
- --high-indigo: #4338ca;
- --medium-indigo: #6366f1;
- --low-indigo: #e0e7ff;
+ --success-background: #f0fdf4;
+ --success-foreground: #14532d;
- --chat-bot-icon: #afe6ef;
- --chat-user-icon: #aface9;
-
- /* Colors that are shared in dark and light mode */
- --blur-shared: #151923de;
- --build-trigger: #dc735b;
- --chat-trigger: #5c8be1;
- --chat-trigger-disabled: #b4c3da;
- --status-red: #ef4444;
- --status-yellow: #eab308;
- --chat-send: #059669;
- --status-green: #4ade80;
- --status-blue:#2563eb;
- --connection: #555;
+ --info-background: #f0f4fd;
+ --info-foreground: #141653;
+ --high-indigo: #4338ca;
+ --medium-indigo: #6366f1;
+ --low-indigo: #e0e7ff;
+
+ --beta-background: rgb(219 234 254);
+ --beta-foreground: rgb(37 99 235);
+
+ --chat-bot-icon: #afe6ef;
+ --chat-user-icon: #aface9;
+
+ /* Colors that are shared in dark and light mode */
+ --blur-shared: #151923de;
+ --build-trigger: #dc735b;
+ --chat-trigger: #5c8be1;
+ --chat-trigger-disabled: #b4c3da;
+ --status-red: #ef4444;
+ --status-yellow: #eab308;
+ --chat-send: #059669;
+ --status-green: #4ade80;
+ --status-blue: #2563eb;
+ --connection: #555;
+ }
+
+ .dark {
+ --background: 224 35% 7.5%; /* hsl(224 40% 10%) */
+ --foreground: 213 31% 80%; /* hsl(213 31% 91%) */
+
+ --muted: 223 27% 11%; /* hsl(223 27% 11%) */
+ --muted-foreground: 215.4 16.3% 56.9%; /* hsl(215 16% 56%) */
+
+ --popover: 224 71% 4%; /* hsl(224 71% 4%) */
+ --popover-foreground: 215 20.2% 65.1%; /* hsl(215 20% 65%) */
+
+ --card: 224 25% 15.5%; /* hsl(224 71% 4%) */
+ --card-foreground: 213 31% 80%; /* hsl(213 31% 91%) */
+
+ --border: 216 24% 17%; /* hsl(216 34% 17%) */
+ --input: 216 24% 17%; /* hsl(216 34% 17%) */
+
+ --primary: 210 20% 80%; /* hsl(210 20% 80%) */
+ --primary-foreground: 222.2 27.4% 1.2%; /* hsl(222 47% 1%) */
+
+ --secondary: 222.2 37.4% 7.2%; /* hsl(222 47% 11%) */
+ --secondary-foreground: 210 40% 80%; /* hsl(210 40% 80%) */
+
+ --accent: 216 24% 20%; /* hsl(216 34% 17%) */
+ --accent-foreground: 210 30% 98%; /* hsl(210 40% 98%) */
+
+ --destructive: 0 63% 31%; /* hsl(0 63% 31%) */
+ --destructive-foreground: 210 40% 98%; /* hsl(210 40% 98%) */
+
+ --ring: 216 24% 30%; /* hsl(216 24% 30%) */
+
+ --radius: 0.5rem;
+
+ --round-btn-shadow: #00000063;
+
+ --success-background: #022c22;
+ --success-foreground: #ecfdf5;
+
+ --error-foreground: #fef2f2;
+ --error-background: #450a0a;
+
+ --info-foreground: #eff6ff;
+ --info-background: #172554;
+
+ --high-indigo: #4338ca;
+ --medium-indigo: #6366f1;
+ --low-indigo: #e0e7ff;
+
+ /* Colors that are shared in dark and light mode */
+ --blur-shared: #151923d2;
+ --build-trigger: #dc735b;
+ --chat-trigger: #5c8be1;
+ --chat-trigger-disabled: #2d3b54;
+ --status-red: #ef4444;
+ --status-yellow: #eab308;
+ --chat-send: #059669;
+ --status-green: #4ade80;
+ --status-blue: #2563eb;
+ --connection: #555;
+
+ --beta-background: rgb(37 99 235);
+ --beta-foreground: rgb(219 234 254);
+
+ --chat-bot-icon: #235d70;
+ --chat-user-icon: #4f3d6e;
+ }
}
-
-.dark {
- --background: 224 35% 7.5%; /* hsl(224 40% 10%) */
- --foreground: 213 31% 80%; /* hsl(213 31% 91%) */
-
- --muted: 223 27% 11%; /* hsl(223 27% 11%) */
- --muted-foreground: 215.4 16.3% 56.9%; /* hsl(215 16% 56%) */
-
- --popover: 224 71% 4%; /* hsl(224 71% 4%) */
- --popover-foreground: 215 20.2% 65.1%; /* hsl(215 20% 65%) */
-
- --card: 224 25% 15.5%; /* hsl(224 71% 4%) */
- --card-foreground: 213 31% 80%; /* hsl(213 31% 91%) */
-
- --border: 216 24% 17%; /* hsl(216 34% 17%) */
- --input: 216 24% 17%; /* hsl(216 34% 17%) */
-
- --primary: 210 20% 80%; /* hsl(210 20% 80%) */
- --primary-foreground: 222.2 27.4% 1.2%; /* hsl(222 47% 1%) */
-
- --secondary: 222.2 37.4% 7.2%; /* hsl(222 47% 11%) */
- --secondary-foreground: 210 40% 80%; /* hsl(210 40% 80%) */
-
- --accent: 216 24% 20%; /* hsl(216 34% 17%) */
- --accent-foreground: 210 30% 98%; /* hsl(210 40% 98%) */
-
- --destructive: 0 63% 31%; /* hsl(0 63% 31%) */
- --destructive-foreground: 210 40% 98%; /* hsl(210 40% 98%) */
-
- --ring: 216 24% 30%; /* hsl(216 24% 30%) */
-
- --radius: 0.5rem;
-
- --round-btn-shadow: #00000063;
-
- --success-background: #022c22;
- --success-foreground: #ecfdf5;
-
- --error-foreground: #fef2f2;
- --error-background: #450a0a;
-
- --info-foreground: #eff6ff;
- --info-background: #172554;
-
-
- --high-indigo: #4338ca;
- --medium-indigo: #6366f1;
- --low-indigo: #e0e7ff;
-
- /* Colors that are shared in dark and light mode */
- --blur-shared: #151923d2;
- --build-trigger: #dc735b;
- --chat-trigger: #5c8be1;
- --chat-trigger-disabled: #2d3b54;
- --status-red: #ef4444;
- --status-yellow: #eab308;
- --chat-send: #059669;
- --status-green: #4ade80;
- --status-blue: #2563eb;
- --connection: #555;
-
- --chat-bot-icon: #235d70;
- --chat-user-icon: #4f3d6e;
-
-}}
diff --git a/src/frontend/src/types/api/index.ts b/src/frontend/src/types/api/index.ts
index eaf862f29..3fa848326 100644
--- a/src/frontend/src/types/api/index.ts
+++ b/src/frontend/src/types/api/index.ts
@@ -14,8 +14,10 @@ export type APIClassType = {
display_name: string;
 input_types?: Array&lt;string&gt;;
 output_types?: Array&lt;string&gt;;
+ beta?: boolean;
documentation: string;
-  [key: string]: Array&lt;string&gt; | string | APITemplateType;
+ error?: string;
+  [key: string]: Array&lt;string&gt; | string | APITemplateType | boolean;
};
export type TemplateVariableType = {
diff --git a/src/frontend/src/types/components/index.ts b/src/frontend/src/types/components/index.ts
index 05311174d..5cf87b8d3 100644
--- a/src/frontend/src/types/components/index.ts
+++ b/src/frontend/src/types/components/index.ts
@@ -62,8 +62,9 @@ export type CodeAreaComponentType = {
onChange: (value: string[] | string) => void;
value: string;
editNode?: boolean;
- nodeClass: APIClassType;
- setNodeClass: (value: APIClassType) => void;
+ nodeClass?: APIClassType;
+ setNodeClass?: (value: APIClassType) => void;
+ dynamic?: boolean;
};
export type FileComponentType = {
diff --git a/src/frontend/src/types/tabs/index.ts b/src/frontend/src/types/tabs/index.ts
index 1a873f651..82934cf35 100644
--- a/src/frontend/src/types/tabs/index.ts
+++ b/src/frontend/src/types/tabs/index.ts
@@ -18,6 +18,8 @@ export type TabsContextType = {
) => void;
downloadFlows: () => void;
uploadFlows: () => void;
+ isBuilt: boolean;
+ setIsBuilt: (state: boolean) => void;
uploadFlow: (newFlow?: boolean, file?: File) => void;
hardReset: () => void;
getNodeId: (nodeType: string) => string;
diff --git a/src/frontend/src/utils/reactflowUtils.ts b/src/frontend/src/utils/reactflowUtils.ts
index 583526167..416ba145f 100644
--- a/src/frontend/src/utils/reactflowUtils.ts
+++ b/src/frontend/src/utils/reactflowUtils.ts
@@ -1,5 +1,5 @@
import _ from "lodash";
-import { Connection, ReactFlowInstance } from "reactflow";
+import { Connection, Edge, ReactFlowInstance } from "reactflow";
import { APITemplateType } from "../types/api";
import { FlowType, NodeType } from "../types/flow";
import { cleanEdgesType } from "../types/utils/reactflowUtils";
@@ -15,7 +15,7 @@ export function cleanEdges({
const sourceNode = nodes.find((node) => node.id === edge.source);
const targetNode = nodes.find((node) => node.id === edge.target);
if (!sourceNode || !targetNode) {
- newEdges = newEdges.filter((e) => e.id !== edge.id);
+ newEdges = newEdges.filter((edg) => edg.id !== edge.id);
}
// check if the source and target handle still exists
if (sourceNode && targetNode) {
@@ -41,7 +41,7 @@ export function cleanEdges({
...sourceNode.data.node.base_classes,
].join("|");
if (id !== sourceHandle) {
- newEdges = newEdges.filter((e) => e.id !== edge.id);
+ newEdges = newEdges.filter((edg) => edg.id !== edge.id);
}
}
}
@@ -57,15 +57,15 @@ export function isValidConnection(
targetHandle
.split("|")[0]
.split(";")
- .some((n) => n === sourceHandle.split("|")[0]) ||
+ .some((target) => target === sourceHandle.split("|")[0]) ||
sourceHandle
.split("|")
.slice(2)
- .some((t) =>
+ .some((target) =>
targetHandle
.split("|")[0]
.split(";")
- .some((n) => n === t)
+ .some((n) => n === target)
) ||
targetHandle.split("|")[0] === "str"
) {
@@ -129,35 +129,35 @@ export function updateTemplate(
export function updateIds(newFlow, getNodeId) {
let idsMap = {};
- newFlow.nodes.forEach((n: NodeType) => {
+ newFlow.nodes.forEach((node: NodeType) => {
// Generate a unique node ID
- let newId = getNodeId(n.data.type);
- idsMap[n.id] = newId;
- n.id = newId;
- n.data.id = newId;
+ let newId = getNodeId(node.data.type);
+ idsMap[node.id] = newId;
+ node.id = newId;
+ node.data.id = newId;
// Add the new node to the list of nodes in state
});
- newFlow.edges.forEach((e) => {
- e.source = idsMap[e.source];
- e.target = idsMap[e.target];
- let sourceHandleSplitted = e.sourceHandle.split("|");
- e.sourceHandle =
+ newFlow.edges.forEach((edge) => {
+ edge.source = idsMap[edge.source];
+ edge.target = idsMap[edge.target];
+ let sourceHandleSplitted = edge.sourceHandle.split("|");
+ edge.sourceHandle =
sourceHandleSplitted[0] +
"|" +
- e.source +
+ edge.source +
"|" +
sourceHandleSplitted.slice(2).join("|");
- let targetHandleSplitted = e.targetHandle.split("|");
- e.targetHandle =
- targetHandleSplitted.slice(0, -1).join("|") + "|" + e.target;
- e.id =
+ let targetHandleSplitted = edge.targetHandle.split("|");
+ edge.targetHandle =
+ targetHandleSplitted.slice(0, -1).join("|") + "|" + edge.target;
+ edge.id =
"reactflow__edge-" +
- e.source +
- e.sourceHandle +
+ edge.source +
+ edge.sourceHandle +
"-" +
- e.target +
- e.targetHandle;
+ edge.target +
+ edge.targetHandle;
});
}
@@ -169,10 +169,10 @@ export function buildTweaks(flow) {
}
export function validateNode(
- n: NodeType,
+ node: NodeType,
reactFlowInstance: ReactFlowInstance
 ): Array&lt;string&gt; {
- if (!n.data?.node?.template || !Object.keys(n.data.node.template)) {
+ if (!node.data?.node?.template || !Object.keys(node.data.node.template)) {
return [
"We've noticed a potential issue with a node in the flow. Please review it and, if necessary, submit a bug report with your exported flow file. Thank you for your help!",
];
@@ -181,7 +181,7 @@ export function validateNode(
const {
type,
node: { template },
- } = n.data;
+ } = node.data;
return Object.keys(template).reduce(
     (errors: Array&lt;string&gt;, t) =>
@@ -194,9 +194,9 @@ export function validateNode(
!reactFlowInstance
.getEdges()
.some(
- (e) =>
- e.targetHandle.split("|")[1] === t &&
- e.targetHandle.split("|")[2] === n.id
+ (edge) =>
+ edge.targetHandle.split("|")[1] === t &&
+ edge.targetHandle.split("|")[2] === node.id
)
? [
`${type} is missing ${
@@ -231,4 +231,13 @@ export function addVersionToDuplicates(flow: FlowType, flows: FlowType[]) {
}
return newName;
-}
\ No newline at end of file
+}
+
+export function getConnectedNodes(
+ edge: Edge,
+  nodes: Array&lt;NodeType&gt;
+): Array&lt;NodeType&gt; {
+ const sourceId = edge.source;
+ const targetId = edge.target;
+ return nodes.filter((node) => node.id === targetId || node.id === sourceId);
+}
diff --git a/src/frontend/src/utils/styleUtils.ts b/src/frontend/src/utils/styleUtils.ts
index 112ea6368..5b40e8c05 100644
--- a/src/frontend/src/utils/styleUtils.ts
+++ b/src/frontend/src/utils/styleUtils.ts
@@ -14,6 +14,7 @@ import {
Cpu,
Download,
DownloadCloud,
+ Edit,
Eraser,
ExternalLink,
File,
@@ -64,7 +65,6 @@ import {
XCircle,
Zap,
} from "lucide-react";
-import { Edge, Node } from "reactflow";
import { AirbyteIcon } from "../icons/Airbyte";
import { AnthropicIcon } from "../icons/Anthropic";
import { BingIcon } from "../icons/Bing";
@@ -74,6 +74,7 @@ import { EvernoteIcon } from "../icons/Evernote";
import { FBIcon } from "../icons/FacebookMessenger";
import { GitBookIcon } from "../icons/GitBook";
import { GoogleIcon } from "../icons/Google";
+import GradientSparkles from "../icons/GradientSparkles";
import { HuggingFaceIcon } from "../icons/HuggingFace";
import { IFixIcon } from "../icons/IFixIt";
import { MetaIcon } from "../icons/Meta";
@@ -146,6 +147,7 @@ export const nodeColors: { [char: string]: string } = {
str: "#049524",
retrievers: "#e6b25a",
unknown: "#9CA3AF",
+ custom_components: "#ab11ab",
};
export const nodeNames: { [char: string]: string } = {
@@ -166,7 +168,8 @@ export const nodeNames: { [char: string]: string } = {
retrievers: "Retrievers",
utilities: "Utilities",
output_parsers: "Output Parsers",
- unknown: "Unknown",
+ custom_components: "Custom",
+ unknown: "Other",
};
export const nodeIconsLucide = {
@@ -202,6 +205,7 @@ export const nodeIconsLucide = {
SupabaseVectorStore: SupabaseIcon,
VertexAI: VertexAIIcon,
ChatVertexAI: VertexAIIcon,
+ VertexAIEmbeddings: VertexAIIcon,
agents: Rocket,
WikipediaAPIWrapper: SvgWikipedia,
chains: Link,
@@ -224,6 +228,8 @@ export const nodeIconsLucide = {
unknown: HelpCircle,
WikipediaQueryRun: SvgWikipedia,
WolframAlphaQueryRun: SvgWolfram,
+ custom_components: GradientSparkles,
+ custom: Edit,
Trash2,
X,
XCircle,
@@ -271,8 +277,3 @@ export const nodeIconsLucide = {
MessageSquare,
MoreHorizontal,
};
-export function getConnectedNodes(edge: Edge, nodes: Array&lt;Node&gt;): Array&lt;Node&gt; {
- const sourceId = edge.source;
- const targetId = edge.target;
- return nodes.filter((node) => node.id === targetId || node.id === sourceId);
-}
diff --git a/src/frontend/src/utils/utils.ts b/src/frontend/src/utils/utils.ts
index 962d1de48..dec07fbe7 100644
--- a/src/frontend/src/utils/utils.ts
+++ b/src/frontend/src/utils/utils.ts
@@ -2,7 +2,7 @@ import clsx, { ClassValue } from "clsx";
import { twMerge } from "tailwind-merge";
import { ADJECTIVES, DESCRIPTIONS, NOUNS } from "../flow_constants";
import { IVarHighlightType } from "../types/components";
-import { FlowType } from "../types/flow";
+import { FlowType, NodeType } from "../types/flow";
import { TabsState } from "../types/tabs";
import { buildTweaks } from "./reactflowUtils";
@@ -88,119 +88,93 @@ export function checkUpperWords(str: string) {
export const isWrappedWithClass = (event: any, className: string | undefined) =>
event.target.closest(`.${className}`);
-export function groupByFamily(data, baseClasses, left, type) {
- let parentOutput: string;
- let arrOfParent: string[] = [];
- let arrOfType: { family: string; type: string; component: string }[] = [];
- let arrOfLength: { length: number; type: string }[] = [];
- let lastType = "";
- Object.keys(data).forEach((d) => {
- Object.keys(data[d]).forEach((n) => {
- try {
- if (
- data[d][n].base_classes.some((r) =>
- baseClasses.split("\n").includes(r)
- )
- ) {
- arrOfParent.push(d);
- }
- if (n === type) {
- parentOutput = d;
- }
+export function groupByFamily(data, baseClasses, left, flow?: NodeType[]) {
+ const baseClassesSet = new Set(baseClasses.split("\n"));
+ let arrOfPossibleInputs = [];
+ let arrOfPossibleOutputs = [];
+ let checkedNodes = new Map();
+ const excludeTypes = new Set([
+ "str",
+ "bool",
+ "float",
+ "code",
+ "prompt",
+ "file",
+ "int",
+ ]);
- if (d !== lastType) {
- arrOfLength.push({
- length: Object.keys(data[d]).length,
- type: d,
- });
+ const checkBaseClass = (template: any) =>
+ template.type &&
+ template.show &&
+ ((!excludeTypes.has(template.type) && baseClassesSet.has(template.type)) ||
+ (template.input_types &&
+ template.input_types.some((inputType) =>
+ baseClassesSet.has(inputType)
+ )));
- lastType = d;
- }
- } catch (e) {
- console.log(e);
- }
- });
- });
+ if (flow) {
+ for (const node of flow) {
+ const nodeData = node.data;
+ const foundNode = checkedNodes.get(nodeData.type);
+ checkedNodes.set(nodeData.type, {
+ hasBaseClassInTemplate:
+ foundNode?.hasBaseClassInTemplate ||
+ Object.values(nodeData.node.template).some(checkBaseClass),
+ hasBaseClassInBaseClasses:
+ foundNode?.hasBaseClassInBaseClasses ||
+ nodeData.node.base_classes.some((baseClass) =>
+ baseClassesSet.has(baseClass)
+ ),
+ });
+ }
+ }
- Object.keys(data).map((d) => {
- Object.keys(data[d]).map((n) => {
- try {
- baseClasses.split("\n").forEach((tol) => {
- data[d][n].base_classes.forEach((data) => {
- if (tol === data) {
- arrOfType.push({
- family: d,
- type: data,
- component: n,
- });
- }
- });
- });
- } catch (e) {
- console.log(e);
- }
- });
- });
+ for (const [d, nodes] of Object.entries(data)) {
+ let tempInputs = [],
+ tempOutputs = [];
- if (left === false) {
- let groupedBy = arrOfType.filter((object, index, self) => {
- const foundIndex = self.findIndex(
- (o) => o.family === object.family && o.type === object.type
- );
- return foundIndex === index;
- });
-
- return groupedBy.reduce((result, item) => {
- const existingGroup = result.find(
- (group) => group.family === item.family
- );
-
- if (existingGroup) {
- existingGroup.type += `, ${item.type}`;
- } else {
- result.push({
- family: item.family,
- type: item.type,
- component: item.component,
- });
+ for (const [n, node] of Object.entries(nodes)) {
+ let foundNode = checkedNodes.get(n);
+ if (!foundNode) {
+ foundNode = {
+ hasBaseClassInTemplate: Object.values(node.template).some(
+ checkBaseClass
+ ),
+ hasBaseClassInBaseClasses: node.base_classes.some((baseClass) =>
+ baseClassesSet.has(baseClass)
+ ),
+ };
+ checkedNodes.set(n, foundNode);
}
- if (left === false) {
- let resFil = result.filter((group) => group.family === parentOutput);
- result = resFil;
- }
-
- return result;
- }, []);
- } else {
- const groupedArray = [];
- const groupedData = {};
-
- arrOfType.forEach((item) => {
- const { family, type, component } = item;
- const key = `${family}-${type}`;
-
- if (!groupedData[key]) {
- groupedData[key] = { family, type, component: [component] };
- } else {
- groupedData[key].component.push(component);
- }
- });
-
- for (const key in groupedData) {
- groupedArray.push(groupedData[key]);
+ if (foundNode.hasBaseClassInTemplate) tempInputs.push(n);
+ if (foundNode.hasBaseClassInBaseClasses) tempOutputs.push(n);
}
- groupedArray.forEach((object, index, self) => {
- const findObj = arrOfLength.find((x) => x.type === object.family);
- if (object.component.length === findObj.length) {
- self[index]["type"] = "";
- } else {
- self[index]["type"] = object.component.join(", ");
- }
- });
- return groupedArray;
+ const totalNodes = Object.keys(nodes).length;
+ if (tempInputs.length)
+ arrOfPossibleInputs.push({
+ category: d,
+ nodes: tempInputs,
+ full: tempInputs.length === totalNodes,
+ });
+ if (tempOutputs.length)
+ arrOfPossibleOutputs.push({
+ category: d,
+ nodes: tempOutputs,
+ full: tempOutputs.length === totalNodes,
+ });
}
+
+ return left
+ ? arrOfPossibleOutputs.map((output) => ({
+ family: output.category,
+ type: output.full ? "" : output.nodes.join(", "),
+ }))
+ : arrOfPossibleInputs.map((input) => ({
+ family: input.category,
+ type: input.full ? "" : input.nodes.join(", "),
+ }));
}
export function buildInputs(tabsState, id) {
@@ -279,6 +253,27 @@ export function buildTweakObject(tweak) {
return tweakString;
}
+/**
+ * Function to get Chat Input Field
+ * @param {FlowType} flow - The current flow.
+ * @param {TabsState} tabsState - The current tabs state.
+ * @returns {string} - The chat input field
+ */
+export function getChatInputField(flow: FlowType, tabsState?: TabsState) {
+ let chat_input_field = "text";
+
+ if (
+ tabsState[flow.id] &&
+ tabsState[flow.id].formKeysData &&
+ tabsState[flow.id].formKeysData.input_keys
+ ) {
+ chat_input_field = Object.keys(
+ tabsState[flow.id].formKeysData.input_keys
+ )[0];
+ }
+ return chat_input_field;
+}
+
/**
* Function to get the python code for the API
* @param {string} flowId - The id of the flow
@@ -364,7 +359,7 @@ export function getCurlCode(
/**
* Function to get the python code for the API
- * @param {string} flowName - The name of the flow
+ * @param {string} flow - The current flow
* @returns {string} - The python code
*/
export function getPythonCode(
@@ -386,3 +381,32 @@ flow = load_flow_from_json("${flowName}.json", tweaks=TWEAKS)
inputs = ${inputs}
flow(inputs)`;
}
+
+/**
+ * Function to get the widget code for the API
+ * @param {string} flow - The current flow.
+ * @returns {string} - The widget code
+ */
+export function getWidgetCode(flow: FlowType, tabsState?: TabsState): string {
+ const flowId = flow.id;
+ const flowName = flow.name;
+ const inputs = buildInputs(tabsState, flow.id);
+ let chat_input_field = getChatInputField(flow, tabsState);
+
+ return `
+
+
+ `;
+}
diff --git a/src/frontend/tailwind.config.js b/src/frontend/tailwind.config.js
index 2b9062c5f..52330ae92 100644
--- a/src/frontend/tailwind.config.js
+++ b/src/frontend/tailwind.config.js
@@ -71,6 +71,8 @@ module.exports = {
"status-yellow": "var(--status-yellow)",
"success-background": "var(--success-background)",
"success-foreground": "var(--success-foreground)",
+ "beta-background": "var(--beta-background)",
+ "beta-foreground": "var(--beta-foreground)",
"chat-bot-icon": "var(--chat-bot-icon)",
"chat-user-icon": "var(--chat-user-icon)",
@@ -153,7 +155,9 @@ module.exports = {
overflow: "hidden",
"text-overflow": "ellipsis",
},
-
+ ".word-break-break-word": {
+ wordBreak: "break-word",
+ },
".arrow-hide": {
"&::-webkit-inner-spin-button": {
"-webkit-appearance": "none",
diff --git a/src/frontend/vite.config.ts b/src/frontend/vite.config.ts
index f051c2eee..d477ce539 100644
--- a/src/frontend/vite.config.ts
+++ b/src/frontend/vite.config.ts
@@ -6,6 +6,9 @@ const apiRoutes = ["^/api/v1/", "/health"];
// Use environment variable to determine the target.
const target = process.env.VITE_PROXY_TARGET || "http://127.0.0.1:7860";
+// Use environment variable to determine the UI server port
+const port = process.env.VITE_PORT || 3000;
+
const proxyTargets = apiRoutes.reduce((proxyObj, route) => {
proxyObj[route] = {
target: target,
@@ -22,7 +25,7 @@ export default defineConfig(() => {
},
plugins: [react(), svgr()],
server: {
- port: 3000,
+ port: port,
proxy: {
...proxyTargets,
},
diff --git a/tests/conftest.py b/tests/conftest.py
index f893533ac..e90d03d0a 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -1,6 +1,7 @@
+from contextlib import contextmanager
import json
from pathlib import Path
-from typing import AsyncGenerator
+from typing import AsyncGenerator, TYPE_CHECKING
from langflow.api.v1.flows import get_session
from langflow.graph.graph.base import Graph
@@ -9,6 +10,10 @@ from fastapi.testclient import TestClient
from httpx import AsyncClient
from sqlmodel import SQLModel, Session, create_engine
from sqlmodel.pool import StaticPool
+from typer.testing import CliRunner
+
+if TYPE_CHECKING:
+ from langflow.services.database.manager import DatabaseManager
def pytest_configure():
@@ -93,8 +98,8 @@ def json_flow():
return f.read()
-@pytest.fixture(name="session") #
-def session_fixture(): #
+@pytest.fixture(name="session")
+def session_fixture():
engine = create_engine(
"sqlite://", connect_args={"check_same_thread": False}, poolclass=StaticPool
)
@@ -103,16 +108,50 @@ def session_fixture(): #
yield session
-@pytest.fixture(name="client") #
-def client_fixture(session: Session): #
- def get_session_override(): #
+@pytest.fixture(name="client")
+def client_fixture(session: Session):
+ def get_session_override():
return session
from langflow.main import create_app
app = create_app()
- app.dependency_overrides[get_session] = get_session_override #
+ app.dependency_overrides[get_session] = get_session_override
+ with TestClient(app) as client:
+ yield client
+ app.dependency_overrides.clear()
- yield TestClient(app)
- app.dependency_overrides.clear() #
+
+# @contextmanager
+# def session_getter():
+# try:
+# session = Session(engine)
+# yield session
+# except Exception as e:
+# print("Session rollback because of exception:", e)
+# session.rollback()
+# raise
+# finally:
+# session.close()
+
+
+# Fixture wrapping the session-getter pattern sketched above
+@pytest.fixture(name="session_getter")
+def session_getter_fixture(client):
+ engine = create_engine(
+ "sqlite://", connect_args={"check_same_thread": False}, poolclass=StaticPool
+ )
+ SQLModel.metadata.create_all(engine)
+
+ @contextmanager
+ def blank_session_getter(db_manager: "DatabaseManager"):
+ with Session(db_manager.engine) as session:
+ yield session
+
+ yield blank_session_getter
+
+
+@pytest.fixture
+def runner():
+ return CliRunner()
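The `session_getter` fixture above yields a context-manager *factory* rather than an open session, so each caller controls the session's lifetime. The same shape, sketched with the stdlib's `sqlite3` instead of SQLModel so it runs standalone (names hypothetical):

```python
import sqlite3
from contextlib import contextmanager


@contextmanager
def blank_session_getter(db_path: str):
    # Open a connection, hand it to the caller, and always close it,
    # mirroring the `with Session(db_manager.engine)` block in the fixture.
    conn = sqlite3.connect(db_path)
    try:
        yield conn
    finally:
        conn.close()


with blank_session_getter(":memory:") as conn:
    conn.execute("CREATE TABLE flow (id INTEGER PRIMARY KEY)")
    conn.execute("INSERT INTO flow DEFAULT VALUES")
    count = conn.execute("SELECT COUNT(*) FROM flow").fetchone()[0]

print(count)  # -> 1
```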
diff --git a/tests/test_agents_template.py b/tests/test_agents_template.py
index 93f4f8b5b..0b5fb7c3a 100644
--- a/tests/test_agents_template.py
+++ b/tests/test_agents_template.py
@@ -18,6 +18,7 @@ def test_zero_shot_agent(client: TestClient):
assert template["tools"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -32,6 +33,7 @@ def test_zero_shot_agent(client: TestClient):
# Additional assertions for other template variables
assert template["callback_manager"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -44,6 +46,7 @@ def test_zero_shot_agent(client: TestClient):
}
assert template["llm"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -56,6 +59,7 @@ def test_zero_shot_agent(client: TestClient):
}
assert template["output_parser"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -68,6 +72,7 @@ def test_zero_shot_agent(client: TestClient):
}
assert template["input_variables"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -80,6 +85,7 @@ def test_zero_shot_agent(client: TestClient):
}
assert template["prefix"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": True,
@@ -93,6 +99,7 @@ def test_zero_shot_agent(client: TestClient):
}
assert template["suffix"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": True,
@@ -118,6 +125,7 @@ def test_json_agent(client: TestClient):
assert template["toolkit"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -130,6 +138,7 @@ def test_json_agent(client: TestClient):
}
assert template["llm"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -155,6 +164,7 @@ def test_csv_agent(client: TestClient):
assert template["path"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -171,6 +181,7 @@ def test_csv_agent(client: TestClient):
}
assert template["llm"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -196,6 +207,7 @@ def test_initialize_agent(client: TestClient):
assert template["agent"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -217,6 +229,7 @@ def test_initialize_agent(client: TestClient):
}
assert template["memory"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -229,6 +242,7 @@ def test_initialize_agent(client: TestClient):
}
assert template["tools"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -241,6 +255,7 @@ def test_initialize_agent(client: TestClient):
}
assert template["llm"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
diff --git a/tests/test_cache_manager.py b/tests/test_cache_manager.py
index f3e65481e..660512634 100644
--- a/tests/test_cache_manager.py
+++ b/tests/test_cache_manager.py
@@ -2,7 +2,7 @@ from io import StringIO
import pandas as pd
import pytest
-from langflow.cache.manager import CacheManager
+from langflow.services.cache.manager import CacheManager
from PIL import Image
diff --git a/tests/test_chains_template.py b/tests/test_chains_template.py
index e183cb0d0..4339dbe3b 100644
--- a/tests/test_chains_template.py
+++ b/tests/test_chains_template.py
@@ -1,13 +1,12 @@
from fastapi.testclient import TestClient
-from langflow.settings import settings
-def test_chains_settings(client: TestClient):
- response = client.get("api/v1/all")
- assert response.status_code == 200
- json_response = response.json()
- chains = json_response["chains"]
- assert set(chains.keys()) == set(settings.chains)
+# def test_chains_settings(client: TestClient):
+# response = client.get("api/v1/all")
+# assert response.status_code == 200
+# json_response = response.json()
+# chains = json_response["chains"]
+# assert set(chains.keys()) == set(settings.chains)
# Test the ConversationChain object
@@ -29,6 +28,7 @@ def test_conversation_chain(client: TestClient):
template = chain["template"]
assert template["memory"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -41,6 +41,7 @@ def test_conversation_chain(client: TestClient):
}
assert template["verbose"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -53,6 +54,7 @@ def test_conversation_chain(client: TestClient):
}
assert template["llm"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -65,6 +67,7 @@ def test_conversation_chain(client: TestClient):
}
assert template["input_key"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -78,6 +81,7 @@ def test_conversation_chain(client: TestClient):
}
assert template["output_key"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -115,6 +119,7 @@ def test_llm_chain(client: TestClient):
template = chain["template"]
assert template["memory"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -127,6 +132,7 @@ def test_llm_chain(client: TestClient):
}
assert template["verbose"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -140,6 +146,7 @@ def test_llm_chain(client: TestClient):
}
assert template["llm"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -152,6 +159,7 @@ def test_llm_chain(client: TestClient):
}
assert template["output_key"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -182,6 +190,7 @@ def test_llm_checker_chain(client: TestClient):
template = chain["template"]
assert template["llm"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -215,6 +224,7 @@ def test_llm_math_chain(client: TestClient):
template = chain["template"]
assert template["memory"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -227,6 +237,7 @@ def test_llm_math_chain(client: TestClient):
}
assert template["verbose"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -240,6 +251,7 @@ def test_llm_math_chain(client: TestClient):
}
assert template["llm"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -252,6 +264,7 @@ def test_llm_math_chain(client: TestClient):
}
assert template["input_key"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -265,6 +278,7 @@ def test_llm_math_chain(client: TestClient):
}
assert template["output_key"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -306,6 +320,7 @@ def test_series_character_chain(client: TestClient):
assert template["llm"] == {
"required": True,
+ "dynamic": False,
"display_name": "LLM",
"placeholder": "",
"show": True,
@@ -319,6 +334,7 @@ def test_series_character_chain(client: TestClient):
}
assert template["character"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -331,6 +347,7 @@ def test_series_character_chain(client: TestClient):
}
assert template["series"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -372,6 +389,7 @@ def test_mid_journey_prompt_chain(client: TestClient):
assert template["llm"] == {
"required": True,
+ "dynamic": False,
"display_name": "LLM",
"placeholder": "",
"show": True,
@@ -412,6 +430,7 @@ def test_time_travel_guide_chain(client: TestClient):
assert template["llm"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"display_name": "LLM",
"show": True,
@@ -425,6 +444,7 @@ def test_time_travel_guide_chain(client: TestClient):
}
assert template["memory"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
diff --git a/tests/test_cli.py b/tests/test_cli.py
new file mode 100644
index 000000000..408500d7a
--- /dev/null
+++ b/tests/test_cli.py
@@ -0,0 +1,30 @@
+from pathlib import Path
+from tempfile import tempdir
+from langflow.__main__ import app
+import pytest
+
+from langflow.services import utils
+
+
+@pytest.fixture(scope="module")
+def default_settings():
+ return [
+ "--backend-only",
+ "--no-open-browser",
+ ]
+
+
+def test_components_path(runner, client, default_settings):
+ # Create a folder in the tmp directory
+ temp_dir = Path(tempdir)
+ # create a "components" folder
+ temp_dir = temp_dir / "components"
+ temp_dir.mkdir(exist_ok=True)
+
+ result = runner.invoke(
+ app,
+ ["--components-path", str(temp_dir), *default_settings],
+ )
+ assert result.exit_code == 0, result.stdout
+ settings_manager = utils.get_settings_manager()
+ assert temp_dir in settings_manager.settings.COMPONENTS_PATH
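One caveat in `test_components_path` above: `tempfile.tempdir` is a module attribute that defaults to `None` until something assigns it, so `Path(tempdir)` can raise `TypeError`. `tempfile.gettempdir()` is the call that actually computes (and caches) the platform temp directory. A sketch of the safer version:

```python
import tempfile
from pathlib import Path

# gettempdir() resolves the platform temp directory; the bare
# tempfile.tempdir attribute may still be None before this call.
temp_dir = Path(tempfile.gettempdir()) / "components"
temp_dir.mkdir(exist_ok=True)
print(temp_dir.name)  # -> components
```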
diff --git a/tests/test_creators.py b/tests/test_creators.py
index 5453b57eb..2098e87cd 100644
--- a/tests/test_creators.py
+++ b/tests/test_creators.py
@@ -35,6 +35,7 @@ def test_lang_chain_type_creator_to_dict(
sample_lang_chain_type_creator: LangChainTypeCreator,
):
type_dict = sample_lang_chain_type_creator.to_dict()
+
assert len(type_dict) == 1
assert "test_type" in type_dict
assert "node1" in type_dict["test_type"]
diff --git a/tests/test_custom_component.py b/tests/test_custom_component.py
new file mode 100644
index 000000000..4dc8c9f1a
--- /dev/null
+++ b/tests/test_custom_component.py
@@ -0,0 +1,558 @@
+import ast
+import pytest
+import types
+from uuid import uuid4
+
+
+from fastapi import HTTPException
+from langflow.services.database.models.flow import Flow, FlowCreate
+from langflow.interface.custom.base import CustomComponent
+from langflow.interface.custom.component import (
+ Component,
+ ComponentCodeNullError,
+ ComponentFunctionEntrypointNameNullError,
+)
+from langflow.interface.custom.code_parser import CodeParser, CodeSyntaxError
+
+
+code_default = """
+from langflow import Prompt
+from langflow.interface.custom.custom_component import CustomComponent
+
+from langchain.llms.base import BaseLLM
+from langchain.chains import LLMChain
+from langchain import PromptTemplate
+from langchain.schema import Document
+
+import requests
+
+class YourComponent(CustomComponent):
+ display_name: str = "Your Component"
+ description: str = "Your description"
+ field_config = { "url": { "multiline": True, "required": True } }
+
+ def build(self, url: str, llm: BaseLLM, template: Prompt) -> Document:
+ response = requests.get(url)
+ prompt = PromptTemplate.from_template(template)
+ chain = LLMChain(llm=llm, prompt=prompt)
+ result = chain.run(response.text[:300])
+ return Document(page_content=str(result))
+"""
+
+
+def test_code_parser_init():
+ """
+ Test the initialization of the CodeParser class.
+ """
+ parser = CodeParser(code_default)
+ assert parser.code == code_default
+
+
+def test_code_parser_get_tree():
+ """
+ Test the __get_tree method of the CodeParser class.
+ """
+ parser = CodeParser(code_default)
+ tree = parser._CodeParser__get_tree()
+ assert isinstance(tree, ast.AST)
+
+
+def test_code_parser_syntax_error():
+ """
+ Test the __get_tree method raises the
+ CodeSyntaxError when given incorrect syntax.
+ """
+ code_syntax_error = "zzz import os"
+
+ parser = CodeParser(code_syntax_error)
+ with pytest.raises(CodeSyntaxError):
+ parser._CodeParser__get_tree()
+
+
+def test_component_init():
+ """
+ Test the initialization of the Component class.
+ """
+ component = Component(code=code_default, function_entrypoint_name="build")
+ assert component.code == code_default
+ assert component.function_entrypoint_name == "build"
+
+
+def test_component_get_code_tree():
+ """
+ Test the get_code_tree method of the Component class.
+ """
+ component = Component(code=code_default, function_entrypoint_name="build")
+ tree = component.get_code_tree(component.code)
+ assert "imports" in tree
+
+
+def test_component_code_null_error():
+ """
+ Test the get_function method raises the
+ ComponentCodeNullError when the code is empty.
+ """
+ component = Component(code="", function_entrypoint_name="")
+ with pytest.raises(ComponentCodeNullError):
+ component.get_function()
+
+
+def test_component_function_entrypoint_name_null_error():
+ """
+ Test the get_function method raises the ComponentFunctionEntrypointNameNullError
+ when the function_entrypoint_name is empty.
+ """
+ component = Component(code=code_default, function_entrypoint_name="")
+ with pytest.raises(ComponentFunctionEntrypointNameNullError):
+ component.get_function()
+
+
+def test_custom_component_init():
+ """
+ Test the initialization of the CustomComponent class.
+ """
+ function_entrypoint_name = "build"
+
+ custom_component = CustomComponent(
+ code=code_default, function_entrypoint_name=function_entrypoint_name
+ )
+ assert custom_component.code == code_default
+ assert custom_component.function_entrypoint_name == function_entrypoint_name
+
+
+def test_custom_component_build_template_config():
+ """
+ Test the build_template_config property of the CustomComponent class.
+ """
+ custom_component = CustomComponent(
+ code=code_default, function_entrypoint_name="build"
+ )
+ config = custom_component.build_template_config
+ assert isinstance(config, dict)
+
+
+def test_custom_component_get_function():
+ """
+ Test the get_function property of the CustomComponent class.
+ """
+ custom_component = CustomComponent(
+ code="def build(): pass", function_entrypoint_name="build"
+ )
+ my_function = custom_component.get_function
+ assert isinstance(my_function, types.FunctionType)
+
+
+def test_code_parser_parse_imports_import():
+ """
+ Test the parse_imports method of the CodeParser
+ class with an import statement.
+ """
+ parser = CodeParser(code_default)
+ tree = parser._CodeParser__get_tree()
+ for node in ast.walk(tree):
+ if isinstance(node, ast.Import):
+ parser.parse_imports(node)
+ assert "requests" in parser.data["imports"]
+
+
+def test_code_parser_parse_imports_importfrom():
+ """
+ Test the parse_imports method of the CodeParser
+ class with an import from statement.
+ """
+ parser = CodeParser("from os import path")
+ tree = parser._CodeParser__get_tree()
+ for node in ast.walk(tree):
+ if isinstance(node, ast.ImportFrom):
+ parser.parse_imports(node)
+ assert ("os", "path") in parser.data["imports"]
+
+
+def test_code_parser_parse_functions():
+ """
+ Test the parse_functions method of the CodeParser class.
+ """
+ parser = CodeParser("def test(): pass")
+ tree = parser._CodeParser__get_tree()
+ for node in ast.walk(tree):
+ if isinstance(node, ast.FunctionDef):
+ parser.parse_functions(node)
+ assert len(parser.data["functions"]) == 1
+ assert parser.data["functions"][0]["name"] == "test"
+
+
+def test_code_parser_parse_classes():
+ """
+ Test the parse_classes method of the CodeParser class.
+ """
+ parser = CodeParser("class Test: pass")
+ tree = parser._CodeParser__get_tree()
+ for node in ast.walk(tree):
+ if isinstance(node, ast.ClassDef):
+ parser.parse_classes(node)
+ assert len(parser.data["classes"]) == 1
+ assert parser.data["classes"][0]["name"] == "Test"
+
+
+def test_code_parser_parse_global_vars():
+ """
+ Test the parse_global_vars method of the CodeParser class.
+ """
+ parser = CodeParser("x = 1")
+ tree = parser._CodeParser__get_tree()
+ for node in ast.walk(tree):
+ if isinstance(node, ast.Assign):
+ parser.parse_global_vars(node)
+ assert len(parser.data["global_vars"]) == 1
+ assert parser.data["global_vars"][0]["targets"] == ["x"]
+
+
+def test_component_get_function_valid():
+ """
+ Test the get_function method of the Component
+ class with valid code and function_entrypoint_name.
+ """
+ component = Component(code="def build(): pass", function_entrypoint_name="build")
+ my_function = component.get_function()
+ assert callable(my_function)
+
+
+def test_custom_component_get_function_entrypoint_args():
+ """
+ Test the get_function_entrypoint_args
+ property of the CustomComponent class.
+ """
+ custom_component = CustomComponent(
+ code=code_default, function_entrypoint_name="build"
+ )
+ args = custom_component.get_function_entrypoint_args
+ assert len(args) == 4
+ assert args[0]["name"] == "self"
+ assert args[1]["name"] == "url"
+ assert args[2]["name"] == "llm"
+
+
+def test_custom_component_get_function_entrypoint_return_type():
+ """
+ Test the get_function_entrypoint_return_type
+ property of the CustomComponent class.
+ """
+ custom_component = CustomComponent(
+ code=code_default, function_entrypoint_name="build"
+ )
+ return_type = custom_component.get_function_entrypoint_return_type
+ assert return_type == ["Document"]
+
+
+def test_custom_component_get_main_class_name():
+ """
+ Test the get_main_class_name property of the CustomComponent class.
+ """
+ custom_component = CustomComponent(
+ code=code_default, function_entrypoint_name="build"
+ )
+ class_name = custom_component.get_main_class_name
+ assert class_name == "YourComponent"
+
+
+def test_custom_component_get_function_valid():
+ """
+ Test the get_function property of the CustomComponent
+ class with valid code and function_entrypoint_name.
+ """
+ custom_component = CustomComponent(
+ code="def build(): pass", function_entrypoint_name="build"
+ )
+ my_function = custom_component.get_function
+ assert callable(my_function)
+
+
+def test_code_parser_parse_arg_no_annotation():
+ """
+ Test the parse_arg method of the CodeParser class without an annotation.
+ """
+ parser = CodeParser("")
+ arg = ast.arg(arg="x", annotation=None)
+ result = parser.parse_arg(arg, None)
+ assert result["name"] == "x"
+ assert "type" not in result
+
+
+def test_code_parser_parse_arg_with_annotation():
+ """
+ Test the parse_arg method of the CodeParser class with an annotation.
+ """
+ parser = CodeParser("")
+ arg = ast.arg(arg="x", annotation=ast.Name(id="int", ctx=ast.Load()))
+ result = parser.parse_arg(arg, None)
+ assert result["name"] == "x"
+ assert result["type"] == "int"
+
+
+def test_code_parser_parse_callable_details_no_args():
+ """
+ Test the parse_callable_details method of the
+ CodeParser class with a function with no arguments.
+ """
+ parser = CodeParser("")
+ node = ast.FunctionDef(
+ name="test",
+ args=ast.arguments(
+ args=[], vararg=None, kwonlyargs=[], kw_defaults=[], kwarg=None, defaults=[]
+ ),
+ body=[],
+ decorator_list=[],
+ returns=None,
+ )
+ result = parser.parse_callable_details(node)
+ assert result["name"] == "test"
+ assert len(result["args"]) == 0
+
+
+def test_code_parser_parse_assign():
+ """
+ Test the parse_assign method of the CodeParser class.
+ """
+ parser = CodeParser("")
+ stmt = ast.Assign(targets=[ast.Name(id="x", ctx=ast.Store())], value=ast.Num(n=1))
+ result = parser.parse_assign(stmt)
+ assert result["name"] == "x"
+ assert result["value"] == "1"
+
+
+def test_code_parser_parse_ann_assign():
+ """
+ Test the parse_ann_assign method of the CodeParser class.
+ """
+ parser = CodeParser("")
+ stmt = ast.AnnAssign(
+ target=ast.Name(id="x", ctx=ast.Store()),
+ annotation=ast.Name(id="int", ctx=ast.Load()),
+ value=ast.Num(n=1),
+ simple=1,
+ )
+ result = parser.parse_ann_assign(stmt)
+ assert result["name"] == "x"
+ assert result["value"] == "1"
+ assert result["annotation"] == "int"
+
+
+def test_code_parser_parse_function_def_not_init():
+ """
+ Test the parse_function_def method of the
+ CodeParser class with a function that is not __init__.
+ """
+ parser = CodeParser("")
+ stmt = ast.FunctionDef(
+ name="test",
+ args=ast.arguments(
+ args=[], vararg=None, kwonlyargs=[], kw_defaults=[], kwarg=None, defaults=[]
+ ),
+ body=[],
+ decorator_list=[],
+ returns=None,
+ )
+ result, is_init = parser.parse_function_def(stmt)
+ assert result["name"] == "test"
+ assert not is_init
+
+
+def test_code_parser_parse_function_def_init():
+ """
+ Test the parse_function_def method of the
+ CodeParser class with an __init__ function.
+ """
+ parser = CodeParser("")
+ stmt = ast.FunctionDef(
+ name="__init__",
+ args=ast.arguments(
+ args=[], vararg=None, kwonlyargs=[], kw_defaults=[], kwarg=None, defaults=[]
+ ),
+ body=[],
+ decorator_list=[],
+ returns=None,
+ )
+ result, is_init = parser.parse_function_def(stmt)
+ assert result["name"] == "__init__"
+ assert is_init
+
+
+def test_component_get_code_tree_syntax_error():
+ """
+ Test the get_code_tree method of the Component class
+ raises the CodeSyntaxError when given incorrect syntax.
+ """
+ component = Component(code="import os as", function_entrypoint_name="build")
+ with pytest.raises(CodeSyntaxError):
+ component.get_code_tree(component.code)
+
+
+def test_custom_component_class_template_validation_no_code():
+ """
+ Test the _class_template_validation method of the CustomComponent class
+ raises the HTTPException when the code is None.
+ """
+ custom_component = CustomComponent(code=None, function_entrypoint_name="build")
+ with pytest.raises(HTTPException):
+ custom_component._class_template_validation(custom_component.code)
+
+
+def test_custom_component_get_code_tree_syntax_error():
+ """
+ Test the get_code_tree method of the CustomComponent class
+ raises the CodeSyntaxError when given incorrect syntax.
+ """
+ custom_component = CustomComponent(
+ code="import os as", function_entrypoint_name="build"
+ )
+ with pytest.raises(CodeSyntaxError):
+ custom_component.get_code_tree(custom_component.code)
+
+
+def test_custom_component_get_function_entrypoint_args_no_args():
+ """
+ Test the get_function_entrypoint_args property of
+ the CustomComponent class with a build method with no arguments.
+ """
+ my_code = """
+class MyMainClass(CustomComponent):
+ def build():
+ pass"""
+
+ custom_component = CustomComponent(code=my_code, function_entrypoint_name="build")
+ args = custom_component.get_function_entrypoint_args
+ assert len(args) == 0
+
+
+def test_custom_component_get_function_entrypoint_return_type_no_return_type():
+ """
+ Test the get_function_entrypoint_return_type property of the
+ CustomComponent class with a build method with no return type.
+ """
+ my_code = """
+class MyClass(CustomComponent):
+ def build():
+ pass"""
+
+ custom_component = CustomComponent(code=my_code, function_entrypoint_name="build")
+ return_type = custom_component.get_function_entrypoint_return_type
+ assert return_type == []
+
+
+def test_custom_component_get_main_class_name_no_main_class():
+ """
+ Test the get_main_class_name property of the
+ CustomComponent class when there is no main class.
+ """
+ my_code = """
+def build():
+ pass"""
+
+ custom_component = CustomComponent(code=my_code, function_entrypoint_name="build")
+ class_name = custom_component.get_main_class_name
+ assert class_name == ""
+
+
+def test_custom_component_build_not_implemented():
+ """
+ Test the build method of the CustomComponent
+ class raises the NotImplementedError.
+ """
+ custom_component = CustomComponent(
+ code="def build(): pass", function_entrypoint_name="build"
+ )
+ with pytest.raises(NotImplementedError):
+ custom_component.build()
+
+
+def test_build_config_no_code():
+ component = CustomComponent(code=None)
+
+ assert component.get_function_entrypoint_args == ""
+ assert component.get_function_entrypoint_return_type == []
+
+
+@pytest.fixture
+def component():
+ return CustomComponent(
+ field_config={
+ "fields": {
+ "llm": {"type": "str"},
+ "url": {"type": "str"},
+ "year": {"type": "int"},
+ }
+ }
+ )
+
+
+@pytest.fixture(scope="session")
+def test_flow(db):
+ flow_data = {
+ "nodes": [{"id": "1"}, {"id": "2"}],
+ "edges": [{"source": "1", "target": "2"}],
+ }
+
+ # Create flow
+ flow = FlowCreate(
+ id=uuid4(), name="Test Flow", description="Fixture flow", data=flow_data
+ )
+
+ # Add to database
+ db.add(flow)
+ db.commit()
+
+ yield flow
+
+ # Clean up
+ db.delete(flow)
+ db.commit()
+
+
+@pytest.fixture(scope="session")
+def db(app):
+ # Setup database for tests
+ yield app.db
+
+ # Teardown
+ app.db.drop_all()
+
+
+def test_list_flows_return_type(component, session_getter):
+ flows = component.list_flows(get_session=session_getter)
+ assert isinstance(flows, list)
+
+
+def test_list_flows_flow_objects(component, session_getter):
+ flows = component.list_flows(get_session=session_getter)
+ assert all(isinstance(flow, Flow) for flow in flows)
+
+
+def test_build_config_return_type(component):
+ config = component.build_config()
+ assert isinstance(config, dict)
+
+
+def test_build_config_has_fields(component):
+ config = component.build_config()
+ assert "fields" in config
+
+
+def test_build_config_fields_dict(component):
+ config = component.build_config()
+ assert isinstance(config["fields"], dict)
+
+
+def test_build_config_field_keys(component):
+ config = component.build_config()
+ assert all(isinstance(key, str) for key in config["fields"])
+
+
+def test_build_config_field_values_dict(component):
+ config = component.build_config()
+ assert all(isinstance(value, dict) for value in config["fields"].values())
+
+
+def test_build_config_field_value_keys(component):
+ config = component.build_config()
+ field_values = config["fields"].values()
+ assert all("type" in value for value in field_values)
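Several tests above reach the parser's private method as `parser._CodeParser__get_tree()`. That works because Python name-mangles double-underscore attributes to `_ClassName__name`; a minimal sketch of the mechanism (class name hypothetical):

```python
import ast


class Parser:
    def __init__(self, code: str):
        self.code = code

    def __get_tree(self) -> ast.AST:
        # Double leading underscores mangle this to _Parser__get_tree.
        return ast.parse(self.code)


p = Parser("x = 1")
tree = p._Parser__get_tree()  # how tests call the "private" method
print(type(tree).__name__)  # -> Module
```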
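The `parse_imports` tests above walk the AST themselves; the traversal a parser like this relies on can be sketched as follows (the tuple-vs-string shapes match the assertions in the tests):

```python
import ast

code = "import requests\nfrom os import path"
imports = []
for node in ast.walk(ast.parse(code)):
    if isinstance(node, ast.Import):
        # `import x` nodes carry one alias per imported module.
        imports.extend(alias.name for alias in node.names)
    elif isinstance(node, ast.ImportFrom):
        # `from m import n` nodes pair the module with each imported name.
        imports.extend((node.module, alias.name) for alias in node.names)

print(imports)  # -> ['requests', ('os', 'path')]
```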
diff --git a/tests/test_database.py b/tests/test_database.py
index bc512b6b0..52a5daa4c 100644
--- a/tests/test_database.py
+++ b/tests/test_database.py
@@ -5,16 +5,9 @@ from uuid import UUID, uuid4
from sqlalchemy.orm import Session
from fastapi.testclient import TestClient
-from fastapi.encoders import jsonable_encoder
from langflow.api.v1.schemas import FlowListCreate
-from langflow.database.models.flow import Flow, FlowCreate, FlowUpdate
-
-from langflow.database.models.flow_style import (
- FlowStyleCreate,
- FlowStyleRead,
- FlowStyleUpdate,
-)
+from langflow.services.database.models.flow import Flow, FlowCreate, FlowUpdate
@pytest.fixture(scope="module")
@@ -56,33 +49,12 @@ def test_read_flows(client: TestClient, json_flow: str):
assert response.json()["name"] == flow.name
assert response.json()["data"] == flow.data
- flow_style = FlowStyleCreate(color="red", emoji="๐", flow_id=response.json()["id"])
- response = client.post(
- "api/v1/flow_styles/", json=jsonable_encoder(flow_style.dict())
- )
- assert response.status_code == 200
- assert response.json()["color"] == flow_style.color
- assert response.json()["emoji"] == flow_style.emoji
- assert response.json()["flow_id"] == str(flow_style.flow_id)
-
flow = FlowCreate(name="Test Flow", description="description", data=data)
response = client.post("api/v1/flows/", json=flow.dict())
assert response.status_code == 201
assert response.json()["name"] == flow.name
assert response.json()["data"] == flow.data
- # Now we need to create FlowStyle objects for each Flow
- flow_style = FlowStyleCreate(
- color="green", emoji="๐", flow_id=response.json()["id"]
- )
- response = client.post(
- "api/v1/flow_styles/", json=jsonable_encoder(flow_style.dict())
- )
- assert response.status_code == 200
- assert response.json()["color"] == flow_style.color
- assert response.json()["emoji"] == flow_style.emoji
- assert response.json()["flow_id"] == str(flow_style.flow_id)
-
response = client.get("api/v1/flows/")
assert response.status_code == 200
assert len(response.json()) > 0
@@ -97,21 +69,10 @@ def test_read_flow(client: TestClient, json_flow: str):
# turn it into a UUID
flow_id = UUID(flow_id)
- flow_style = FlowStyleCreate(color="green", emoji="๐", flow_id=flow_id)
- response = client.post(
- "api/v1/flow_styles/", json=jsonable_encoder(flow_style.dict())
- )
- assert response.status_code == 200
- response_json = response.json()
- assert response_json["color"] == flow_style.color
- assert response_json["emoji"] == flow_style.emoji
- assert response_json["flow_id"] == str(flow_style.flow_id)
-
response = client.get(f"api/v1/flows/{flow_id}")
assert response.status_code == 200
assert response.json()["name"] == flow.name
assert response.json()["data"] == flow.data
- assert response.json()["style"]["color"] == flow_style.color
def test_update_flow(client: TestClient, json_flow: str):
@@ -275,66 +236,3 @@ def test_read_empty_flows(client: TestClient):
response = client.get("api/v1/flows/")
assert response.status_code == 200
assert len(response.json()) == 0
-
-
-def test_create_flow_style(client: TestClient):
- flow_style = FlowStyleCreate(color="red", emoji="🔴")
- response = client.post("api/v1/flow_styles/", json=flow_style.dict())
- assert response.status_code == 200
- created_flow_style = FlowStyleRead(**response.json())
- assert created_flow_style.color == flow_style.color
- assert created_flow_style.emoji == flow_style.emoji
-
-
-def test_read_flow_styles(client: TestClient):
- response = client.get("api/v1/flow_styles/")
- assert response.status_code == 200
- flow_styles = [FlowStyleRead(**flow_style) for flow_style in response.json()]
- assert not flow_styles
- # Create test data
- flow_style = FlowStyleCreate(color="red", emoji="🔴")
- response = client.post("api/v1/flow_styles/", json=flow_style.dict())
- assert response.status_code == 200
- # Check response data
- response = client.get("api/v1/flow_styles/")
- assert response.status_code == 200
- flow_styles = [FlowStyleRead(**flow_style) for flow_style in response.json()]
- assert len(flow_styles) == 1
- assert flow_styles[0].color == flow_style.color
- assert flow_styles[0].emoji == flow_style.emoji
-
-
-def test_read_flow_style(client: TestClient):
- flow_style = FlowStyleCreate(color="red", emoji="🔴")
- response = client.post("api/v1/flow_styles/", json=flow_style.dict())
- created_flow_style = FlowStyleRead(**response.json())
- response = client.get(f"api/v1/flow_styles/{created_flow_style.id}")
- assert response.status_code == 200
- read_flow_style = FlowStyleRead(**response.json())
- assert read_flow_style == created_flow_style
-
-
-def test_update_flow_style(client: TestClient):
- flow_style = FlowStyleCreate(color="red", emoji="🔴")
- response = client.post("api/v1/flow_styles/", json=flow_style.dict())
- created_flow_style = FlowStyleRead(**response.json())
- to_update_flow_style = FlowStyleUpdate(color="blue")
- response = client.patch(
- f"api/v1/flow_styles/{created_flow_style.id}", json=to_update_flow_style.dict()
- )
- assert response.status_code == 200
- updated_flow_style = FlowStyleRead(**response.json())
- assert updated_flow_style.color == "blue"
- assert updated_flow_style.emoji == flow_style.emoji
-
-
-def test_delete_flow_style(client: TestClient):
- flow_style = FlowStyleCreate(color="red", emoji="๐ด")
- response = client.post("api/v1/flow_styles/", json=flow_style.dict())
- created_flow_style = FlowStyleRead(**response.json())
- response = client.delete(f"api/v1/flow_styles/{created_flow_style.id}")
- assert response.status_code == 200
- assert response.json() == {"message": "FlowStyle deleted successfully"}
-
- response = client.get(f"api/v1/flow_styles/{created_flow_style.id}")
- assert response.status_code == 404
diff --git a/tests/test_graph.py b/tests/test_graph.py
index 228bbb4d6..f3efe3614 100644
--- a/tests/test_graph.py
+++ b/tests/test_graph.py
@@ -12,7 +12,6 @@ from langflow.graph.vertex.types import (
FileToolVertex,
LLMVertex,
ToolkitVertex,
- WrapperVertex,
)
from langflow.processing.process import get_result_and_thought
from langflow.utils.payload import get_root_node
@@ -292,11 +291,11 @@ def test_file_tool_node_build(openapi_graph):
assert not Path(file_path).exists()
-def test_wrapper_node_build(openapi_graph):
- wrapper_node = get_node_by_type(openapi_graph, WrapperVertex)
- assert wrapper_node is not None
- built_object = wrapper_node.build()
- assert built_object is not None
+# def test_wrapper_node_build(openapi_graph):
+# wrapper_node = get_node_by_type(openapi_graph, WrapperVertex)
+# assert wrapper_node is not None
+# built_object = wrapper_node.build()
+# assert built_object is not None
def test_get_result_and_thought(basic_graph):
diff --git a/tests/test_llms_template.py b/tests/test_llms_template.py
index 7679ba9c0..f1b76e18e 100644
--- a/tests/test_llms_template.py
+++ b/tests/test_llms_template.py
@@ -1,13 +1,14 @@
from fastapi.testclient import TestClient
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
def test_llms_settings(client: TestClient):
+ settings_manager = get_settings_manager()
response = client.get("api/v1/all")
assert response.status_code == 200
json_response = response.json()
llms = json_response["llms"]
- assert set(llms.keys()) == set(settings.llms)
+ assert set(llms.keys()) == set(settings_manager.settings.LLMS)
# def test_hugging_face_hub(client: TestClient):
@@ -113,6 +114,7 @@ def test_openai(client: TestClient):
assert template["cache"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -125,6 +127,7 @@ def test_openai(client: TestClient):
}
assert template["verbose"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -137,6 +140,7 @@ def test_openai(client: TestClient):
}
assert template["client"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -149,6 +153,7 @@ def test_openai(client: TestClient):
}
assert template["model_name"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -170,6 +175,7 @@ def test_openai(client: TestClient):
# Add more assertions for other properties here
assert template["temperature"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -183,6 +189,7 @@ def test_openai(client: TestClient):
}
assert template["max_tokens"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -196,6 +203,7 @@ def test_openai(client: TestClient):
}
assert template["top_p"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -209,6 +217,7 @@ def test_openai(client: TestClient):
}
assert template["frequency_penalty"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -222,6 +231,7 @@ def test_openai(client: TestClient):
}
assert template["presence_penalty"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -235,6 +245,7 @@ def test_openai(client: TestClient):
}
assert template["n"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -248,6 +259,7 @@ def test_openai(client: TestClient):
}
assert template["best_of"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -261,6 +273,7 @@ def test_openai(client: TestClient):
}
assert template["model_kwargs"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -273,6 +286,7 @@ def test_openai(client: TestClient):
}
assert template["openai_api_key"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -287,6 +301,7 @@ def test_openai(client: TestClient):
}
assert template["batch_size"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -300,6 +315,7 @@ def test_openai(client: TestClient):
}
assert template["request_timeout"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -312,6 +328,7 @@ def test_openai(client: TestClient):
}
assert template["logit_bias"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -324,6 +341,7 @@ def test_openai(client: TestClient):
}
assert template["max_retries"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -337,6 +355,7 @@ def test_openai(client: TestClient):
}
assert template["streaming"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -361,6 +380,7 @@ def test_chat_open_ai(client: TestClient):
assert template["verbose"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -374,6 +394,7 @@ def test_chat_open_ai(client: TestClient):
}
assert template["client"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -386,6 +407,7 @@ def test_chat_open_ai(client: TestClient):
}
assert template["model_name"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -409,6 +431,7 @@ def test_chat_open_ai(client: TestClient):
}
assert template["temperature"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -422,6 +445,7 @@ def test_chat_open_ai(client: TestClient):
}
assert template["model_kwargs"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -434,6 +458,7 @@ def test_chat_open_ai(client: TestClient):
}
assert template["openai_api_key"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
@@ -448,6 +473,7 @@ def test_chat_open_ai(client: TestClient):
}
assert template["request_timeout"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -460,6 +486,7 @@ def test_chat_open_ai(client: TestClient):
}
assert template["max_retries"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -473,6 +500,7 @@ def test_chat_open_ai(client: TestClient):
}
assert template["streaming"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -486,6 +514,7 @@ def test_chat_open_ai(client: TestClient):
}
assert template["n"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -500,6 +529,7 @@ def test_chat_open_ai(client: TestClient):
assert template["max_tokens"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": False,
diff --git a/tests/test_prompts_template.py b/tests/test_prompts_template.py
index 5486f3034..dde313c20 100644
--- a/tests/test_prompts_template.py
+++ b/tests/test_prompts_template.py
@@ -1,13 +1,14 @@
from fastapi.testclient import TestClient
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
def test_prompts_settings(client: TestClient):
+ settings_manager = get_settings_manager()
response = client.get("api/v1/all")
assert response.status_code == 200
json_response = response.json()
prompts = json_response["prompts"]
- assert set(prompts.keys()) == set(settings.prompts)
+ assert set(prompts.keys()) == set(settings_manager.settings.PROMPTS)
def test_prompt_template(client: TestClient):
@@ -20,6 +21,7 @@ def test_prompt_template(client: TestClient):
template = prompt["template"]
assert template["input_variables"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -30,8 +32,10 @@ def test_prompt_template(client: TestClient):
"advanced": False,
"info": "",
}
+
assert template["output_parser"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -42,8 +46,10 @@ def test_prompt_template(client: TestClient):
"advanced": False,
"info": "",
}
+
assert template["partial_variables"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -54,8 +60,10 @@ def test_prompt_template(client: TestClient):
"advanced": False,
"info": "",
}
+
assert template["template"] == {
"required": True,
+ "dynamic": False,
"placeholder": "",
"show": True,
"multiline": True,
@@ -66,8 +74,10 @@ def test_prompt_template(client: TestClient):
"advanced": False,
"info": "",
}
+
assert template["template_format"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
@@ -79,8 +89,10 @@ def test_prompt_template(client: TestClient):
"advanced": False,
"info": "",
}
+
assert template["validate_template"] == {
"required": False,
+ "dynamic": False,
"placeholder": "",
"show": False,
"multiline": False,
diff --git a/tests/test_vectorstore_template.py b/tests/test_vectorstore_template.py
index 0aa823786..4baa7f4b6 100644
--- a/tests/test_vectorstore_template.py
+++ b/tests/test_vectorstore_template.py
@@ -1,12 +1,14 @@
from fastapi.testclient import TestClient
-from langflow.settings import settings
+from langflow.services.utils import get_settings_manager
# check that all agents are in settings.agents
# are in json_response["agents"]
def test_vectorstores_settings(client: TestClient):
+ settings_manager = get_settings_manager()
response = client.get("api/v1/all")
assert response.status_code == 200
json_response = response.json()
vectorstores = json_response["vectorstores"]
- assert set(vectorstores.keys()) == set(settings.vectorstores)
+ settings_vecs = set(settings_manager.settings.VECTORSTORES)
+ assert all(vs in vectorstores for vs in settings_vecs)
diff --git a/tests/test_websocket.py b/tests/test_websocket.py
index 57a0e95f6..dd668c287 100644
--- a/tests/test_websocket.py
+++ b/tests/test_websocket.py
@@ -1,6 +1,6 @@
from fastapi import WebSocketDisconnect
-# from langflow.chat.manager import ChatManager
+# from langflow.services.chat.manager import ChatManager
import pytest