Merge remote-tracking branch 'origin/dev' into zustand/io/migration
commit 8e6c38f195
64 changed files with 1081 additions and 593 deletions
@@ -98,9 +98,9 @@ Used to load [OpenAI’s](https://openai.com/) embedding models.
 
 Wrapper around [Google Vertex AI](https://cloud.google.com/vertex-ai) [Embeddings API](https://cloud.google.com/vertex-ai/docs/generative-ai/embeddings/get-text-embeddings).
 
-:::info
+<Admonition type="info">
 Vertex AI is a cloud computing platform offered by Google Cloud Platform (GCP). It provides access, management, and development of applications and services through global data centers. To use Vertex AI PaLM, you need to have the [google-cloud-aiplatform](https://pypi.org/project/google-cloud-aiplatform/) Python package installed and credentials configured for your environment.
-:::
+</Admonition>
 
 - **credentials:** The default custom credentials (google.auth.credentials.Credentials) to use.
 - **location:** The default location to use when making API calls – defaults to `us-central1`.
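The `credentials` and `location` parameters documented in the hunk above behave like ordinary keyword defaults. A minimal sketch of that shape (the `VertexConfig` name and dataclass layout are hypothetical, not Langflow's actual implementation):

```python
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class VertexConfig:
    """Hypothetical config mirroring the two documented parameters."""
    # Custom google.auth credentials; None falls back to the environment.
    credentials: Optional[Any] = None
    # Region used for API calls; the docs state us-central1 as the default.
    location: str = "us-central1"

cfg = VertexConfig()
```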
@@ -40,9 +40,10 @@ Wrapper around Anthropic's large language model used for chat-based interactions
 
 The `CTransformers` component provides access to the Transformer models implemented in C/C++ using the [GGML](https://github.com/ggerganov/ggml) library.
 
-:::info
+<Admonition type="info">
+
 Make sure to have the `ctransformers` python package installed. Learn more about installation, supported models, and usage [here](https://github.com/marella/ctransformers).
-:::
+</Admonition>
 
 **config:** Configuration for the Transformer models. Check out [config](https://github.com/marella/ctransformers#config). Defaults to:
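A `config` like the one referenced above is typically applied as an overlay on library defaults. A rough sketch of that merge behavior (the keys and default values here are illustrative only; the real fields are listed in the ctransformers README):

```python
def merge_ctransformers_config(user_config):
    """Overlay user-supplied keys on illustrative defaults.

    The keys below are examples, not ctransformers' actual defaults;
    see the library's README for the real config fields.
    """
    defaults = {"max_new_tokens": 256, "temperature": 0.8, "stream": False}
    return {**defaults, **(user_config or {})}

merged = merge_ctransformers_config({"temperature": 0.2})
```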
@@ -115,9 +116,9 @@ Wrapper around [Cohere's](https://cohere.com) large language models.
 
 Wrapper around [HuggingFace](https://www.huggingface.co/models) models.
 
-:::info
+<Admonition type="info">
 The HuggingFace Hub is an online platform that hosts over 120k models, 20k datasets, and 50k demo apps, all of which are open-source and publicly available. Discover more at [HuggingFace](http://www.huggingface.co).
-:::
+</Admonition>
 
 - **huggingfacehub_api_token:** Token needed to authenticate the API.
 - **model_kwargs:** Keyword arguments to pass to the model.
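The `huggingfacehub_api_token` parameter above is used for HTTP authentication against the Hub. A minimal sketch of how such a token is carried, assuming the standard Bearer scheme (this helper is hypothetical, not Langflow code):

```python
def hf_auth_headers(huggingfacehub_api_token: str) -> dict:
    """Build the Bearer header used to authenticate Hub HTTP requests."""
    if not huggingfacehub_api_token:
        raise ValueError("huggingfacehub_api_token is required")
    return {"Authorization": f"Bearer {huggingfacehub_api_token}"}

headers = hf_auth_headers("hf_example_token")
```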
@@ -130,9 +131,9 @@ The HuggingFace Hub is an online platform that hosts over 120k models, 20k datas
 
 The `LlamaCpp` component provides access to the `llama.cpp` models.
 
-:::info
+<Admonition type="info">
 Make sure to have the `llama.cpp` python package installed. Learn more about installation, supported models, and usage [here](https://github.com/ggerganov/llama.cpp).
-:::
+</Admonition>
 
 - **echo:** Whether to echo the prompt – defaults to `False`.
 - **f16_kv:** Use half-precision for key/value cache – defaults to `True`.
@@ -181,9 +182,9 @@ Wrapper around [OpenAI's](https://openai.com) large language models.
 
 Wrapper around [Google Vertex AI](https://cloud.google.com/vertex-ai) large language models.
 
-:::info
+<Admonition type="info">
 Vertex AI is a cloud computing platform offered by Google Cloud Platform (GCP). It provides access, management, and development of applications and services through global data centers. To use Vertex AI PaLM, you need to have the [google-cloud-aiplatform](https://pypi.org/project/google-cloud-aiplatform/) Python package installed and credentials configured for your environment.
-:::
+</Admonition>
 
 - **credentials:** The default custom credentials (google.auth.credentials.Credentials) to use.
 - **location:** The default location to use when making API calls – defaults to `us-central1`.
@@ -203,9 +204,9 @@ Vertex AI is a cloud computing platform offered by Google Cloud Platform (GCP).
 
 Wrapper around [Google Vertex AI](https://cloud.google.com/vertex-ai) large language models.
 
-:::info
+<Admonition type="info">
 Vertex AI is a cloud computing platform offered by Google Cloud Platform (GCP). It provides access, management, and development of applications and services through global data centers. To use Vertex AI PaLM, you need to have the [google-cloud-aiplatform](https://pypi.org/project/google-cloud-aiplatform/) Python package installed and credentials configured for your environment.
-:::
+</Admonition>
 
 - **credentials:** The default custom credentials (google.auth.credentials.Credentials) to use.
 - **location:** The default location to use when making API calls – defaults to `us-central1`.
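The recurring edit in these docs hunks swaps Docusaurus `:::` admonition directives for the `<Admonition>` JSX component. A rough sketch of automating that mechanical conversion (this helper is hypothetical, not part of the commit; real MDX would need more care, e.g. titled or nested admonitions):

```python
import re

def admonitionize(markdown: str) -> str:
    """Rewrite :::type ... ::: blocks as <Admonition type="..."> JSX."""
    # Opening directive like ":::info" becomes an opening JSX tag.
    out = re.sub(r"^:::(\w+)\s*$", r'<Admonition type="\1">', markdown, flags=re.M)
    # A bare ":::" closing line becomes the closing JSX tag.
    out = re.sub(r"^:::\s*$", "</Admonition>", out, flags=re.M)
    return out

converted = admonitionize(":::info\nSome note.\n:::")
```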
@@ -9,6 +9,21 @@ import Admonition from '@theme/Admonition';
 
 </Admonition>
 
+### SearchApi
+
+Real-time search engine results API. Returns structured JSON data that includes answer box, knowledge graph, organic results, and more.
+
+**Parameters**
+
+- **Api Key:** A unique identifier for the SearchApi, necessary for authenticating requests to real-time search engines. This key can be retrieved from the [SearchApi dashboard](https://www.searchapi.io/).
+- **Engine:** Specifies the search engine. For instance: google, google_scholar, bing, youtube, and youtube_transcripts. A full list of supported engines is available in the [documentation](https://www.searchapi.io/docs/google).
+- **Parameters:** Allows the selection of any parameters recognized by SearchApi, with some being required and others optional.
+
+**Output**
+
+- **Document:** The JSON response from the request as a Document.
+
 ### BingSearchRun
 
 Bing Search is a web search engine owned and operated by Microsoft. It provides search results for various types of content, including web pages, images, videos, and news articles. It uses a combination of algorithms and human editors to deliver search results to users.
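The Api Key, Engine, and extra parameters described above are ultimately assembled into one request. A minimal sketch of that assembly (the endpoint path and parameter names follow SearchApi's public docs, but treat this as an illustration rather than a client library):

```python
from urllib.parse import urlencode

def searchapi_url(api_key: str, engine: str, **params) -> str:
    """Assemble a GET URL carrying the documented SearchApi parameters."""
    query = {"api_key": api_key, "engine": engine, **params}
    return "https://www.searchapi.io/api/v1/search?" + urlencode(query)

url = searchapi_url("KEY", "youtube_transcripts", video_id="dQw4w9WgXcQ")
```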
@@ -60,4 +75,4 @@ Tool for getting metadata about a SQL database. The input to this tool is a comm
 
 **Params**
 
 - **Db:** SQLDatabase to query.
@@ -4,9 +4,8 @@
 
 This guide will help you set up a Langflow development VM in a Google Cloud Platform project using Google Cloud Shell.
 
-:::note
-When Cloud Shell opens, be sure to select **Trust repo**. Some `gcloud` commands might not run in an ephemeral Cloud Shell environment.
-:::
+> Note: When Cloud Shell opens, be sure to select **Trust repo**. Some `gcloud` commands might not run in an ephemeral Cloud Shell environment.
 
 ## Standard VM
@@ -14,6 +14,7 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 alt="Docusaurus themed image"
 sources={{
 light: "img/buffer-memory.png",
 dark: "img/buffer-memory.png",
 }}
 />
@@ -20,6 +20,7 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 alt="Docusaurus themed image"
 sources={{
 light: "img/basic-chat.png",
 dark: "img/basic-chat.png",
 }}
 />
@@ -32,6 +32,7 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 alt="Docusaurus themed image"
 sources={{
 light: "img/csv-loader.png",
 dark: "img/csv-loader.png",
 }}
 />
@@ -39,12 +40,12 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 
 <Admonition type="note" title="LangChain Components 🦜🔗">
 
-- [`CSVLoader`](https://python.langchain.com/docs/modules/data_connection/document_loaders/integrations/csv)
+- [`CSVLoader`](https://python.langchain.com/docs/integrations/document_loaders/csv)
 - [`CharacterTextSplitter`](https://python.langchain.com/docs/modules/data_connection/document_transformers/text_splitters/character_text_splitter)
-- [`OpenAIEmbedding`](https://python.langchain.com/docs/modules/data_connection/text_embedding/integrations/openai)
-- [`Chroma`](https://python.langchain.com/docs/modules/data_connection/vectorstores/integrations/chroma)
+- [`OpenAIEmbedding`](https://python.langchain.com/docs/integrations/text_embedding/openai)
+- [`Chroma`](https://python.langchain.com/docs/integrations/vectorstores/chroma)
 - [`VectorStoreInfo`](https://python.langchain.com/docs/modules/data_connection/vectorstores/)
 - [`OpenAI`](https://python.langchain.com/docs/modules/model_io/models/llms/integrations/openai)
-- [`VectorStoreAgent`](https://python.langchain.com/docs/modules/agents/toolkits/vectorstore)
+- [`VectorStoreAgent`](https://js.langchain.com/docs/modules/agents/tools/how_to/agents_with_vectorstores)
 
 </Admonition>
@@ -14,6 +14,7 @@ The CustomComponent class allows us to create components that interact with Lang
 alt="Document Processor Component"
 sources={{
 light: "img/flow_runner.png",
 dark: "img/flow_runner.png",
 }}
 style={{
 width: "30%",
@@ -339,6 +340,7 @@ Done! This is what our script and custom component looks like:
 alt="Document Processor Code"
 sources={{
 light: "img/flow_runner_code.png",
 dark: "img/flow_runner_code.png",
 }}
 style={{
 maxWidth: "100%",
@@ -353,6 +355,7 @@ Done! This is what our script and custom component looks like:
 alt="Document Processor Component"
 sources={{
 light: "img/flow_runner.png",
 dark: "img/flow_runner.png",
 }}
 style={{
 width: "40%",
@@ -12,6 +12,7 @@ Langflow Examples is a repository on [GitHub](https://github.com/logspace-ai/lan
 alt="Docusaurus themed image"
 sources={{
 light: "img/community-examples.png",
 dark: "img/community-examples.png",
 }}
 style={{ width: "100%" }}
 />
@@ -32,6 +32,7 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 alt="Docusaurus themed image"
 sources={{
 light: "img/midjourney-prompt-chain.png",
 dark: "img/midjourney-prompt-chain.png",
 }}
 />
@@ -40,6 +41,6 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 <Admonition type="note" title="LangChain Components 🦜🔗">
 
 - [`OpenAI`](https://python.langchain.com/docs/modules/model_io/models/llms/integrations/openai)
-- [`ConversationSummaryMemory`](https://python.langchain.com/docs/modules/memory/how_to/summary)
+- [`ConversationSummaryMemory`](https://python.langchain.com/docs/modules/memory/types/summary)
 
 </Admonition>
@@ -24,7 +24,7 @@ https://pt.wikipedia.org/wiki/Harry_Potter
 
 <Admonition type="info">
 Learn more about Multiple Vector Stores
-[here](https://python.langchain.com/docs/modules/agents/toolkits/vectorstore?highlight=Multiple%20Vector%20Stores#multiple-vectorstores).
+[here](https://python.langchain.com/docs/modules/data_connection/vectorstores/).
 </Admonition>
 
 ## ⛓️ Langflow Example
@@ -37,6 +37,7 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 alt="Docusaurus themed image"
 sources={{
 light: "img/multiple-vectorstores.png",
 dark: "img/multiple-vectorstores.png",
 }}
 />
@@ -44,14 +45,14 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 
 <Admonition type="note" title="LangChain Components 🦜🔗">
 
-- [`WebBaseLoader`](https://python.langchain.com/docs/modules/data_connection/document_loaders/integrations/web_base)
-- [`TextLoader`](https://python.langchain.com/docs/modules/data_connection/document_loaders/integrations/unstructured_file)
+- [`WebBaseLoader`](https://python.langchain.com/docs/integrations/document_loaders/web_base)
+- [`TextLoader`](https://python.langchain.com/docs/modules/data_connection/document_loaders/)
 - [`CharacterTextSplitter`](https://python.langchain.com/docs/modules/data_connection/document_transformers/text_splitters/character_text_splitter)
-- [`OpenAIEmbedding`](https://python.langchain.com/docs/modules/data_connection/text_embedding/integrations/openai)
-- [`Chroma`](https://python.langchain.com/docs/modules/data_connection/vectorstores/integrations/chroma)
+- [`OpenAIEmbedding`](https://python.langchain.com/docs/integrations/text_embedding/openai)
+- [`Chroma`](https://python.langchain.com/docs/integrations/vectorstores/chroma)
 - [`VectorStoreInfo`](https://python.langchain.com/docs/modules/data_connection/vectorstores/)
 - [`OpenAI`](https://python.langchain.com/docs/modules/model_io/models/llms/integrations/openai)
-- [`VectorStoreRouterToolkit`](https://python.langchain.com/docs/modules/agents/toolkits/vectorstore)
-- [`VectorStoreRouterAgent`](https://python.langchain.com/docs/modules/agents/toolkits/vectorstore)
+- [`VectorStoreRouterToolkit`](https://js.langchain.com/docs/modules/agents/tools/how_to/agents_with_vectorstores)
+- [`VectorStoreRouterAgent`](https://js.langchain.com/docs/modules/agents/tools/how_to/agents_with_vectorstores)
 
 </Admonition>
@@ -28,7 +28,7 @@ The `AgentInitializer` component is a quick way to construct an agent from the m
 <Admonition type="info">
 The `PythonFunction` is a custom component that uses the LangChain 🦜🔗 tool
 decorator. Learn more about it
-[here](https://python.langchain.com/docs/modules/agents/tools/how_to/custom_tools).
+[here](https://python.langchain.com/docs/modules/agents/tools/custom_tools).
 </Admonition>
 
 ## ⛓️ Langflow Example
@@ -41,6 +41,7 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 alt="Docusaurus themed image"
 sources={{
 light: "img/python-function.png",
 dark: "img/python-function.png",
 }}
 />
@@ -48,7 +49,7 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 
 <Admonition type="note" title="LangChain Components 🦜🔗">
 
-- [`PythonFunctionTool`](https://python.langchain.com/docs/modules/agents/tools/how_to/custom_tools)
+- [`PythonFunctionTool`](https://python.langchain.com/docs/modules/agents/tools/custom_tools)
 - [`ChatOpenAI`](https://python.langchain.com/docs/modules/model_io/models/chat/integrations/openai)
 - [`AgentInitializer`](https://python.langchain.com/docs/modules/agents/)
docs/docs/examples/searchapi-tool.mdx (new file, 52 lines)
@@ -0,0 +1,52 @@
+import Admonition from "@theme/Admonition";
+
+# SearchApi Tool
+
+The [SearchApi](https://www.searchapi.io/) allows developers to retrieve results from search engines such as Google, Google Scholar, YouTube, YouTube transcripts, and more, and can be used in Langflow through the `SearchApi` tool.
+
+<Admonition type="info">
+To use the SearchApi, you must first obtain an API key by registering at [SearchApi's website](https://www.searchapi.io/).
+</Admonition>
+
+In the given example, we specify `engine` as `youtube_transcripts` and provide a `video_id`.
+
+<Admonition type="info">
+All engines and parameters can be found in the [SearchApi documentation](https://www.searchapi.io/docs/google).
+</Admonition>
+
+The `RetrievalQA` chain processes a `Document` along with a user's question to return an answer.
+
+<Admonition type="tip">
+In this example, we used [`ChatOpenAI`](https://platform.openai.com/) as the
+LLM, but feel free to experiment with other Language Models!
+</Admonition>
+
+The `RetrievalQA` takes the `CombineDocsChain` and the `SearchApi` tool as inputs, using the tool as a `Document` to answer questions.
+
+<Admonition type="info">
+Learn more about the SearchApi
+[here](https://python.langchain.com/docs/integrations/tools/searchapi).
+</Admonition>
+
+## ⛓️ Langflow Example
+
+import ThemedImage from "@theme/ThemedImage";
+import useBaseUrl from "@docusaurus/useBaseUrl";
+import ZoomableImage from "/src/theme/ZoomableImage.js";
+
+<ZoomableImage
+alt="Docusaurus themed image"
+sources={{
+light: "img/searchapi-tool.png",
+}}
+/>
+
+#### <a target="\_blank" href="json_files/SearchApi_Tool.json" download>Download Flow</a>
+
+<Admonition type="note" title="LangChain Components 🦜🔗">
+
+- [`OpenAI`](https://python.langchain.com/docs/modules/model_io/models/llms/integrations/openai)
+- [`SearchApiAPIWrapper`](https://python.langchain.com/docs/integrations/providers/searchapi#wrappers)
+- [`ZeroShotAgent`](https://python.langchain.com/docs/modules/agents/how_to/custom_mrkl_agent)
+
+</Admonition>
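The new page above feeds the tool's JSON response into `RetrievalQA` as a `Document`. A sketch of that flattening step for a `youtube_transcripts`-style payload (the response shape used here is hypothetical and truncated; the real schema is defined by SearchApi's docs):

```python
import json

# A hypothetical, truncated youtube_transcripts response.
raw = json.dumps({
    "transcripts": [
        {"text": "Hello and welcome.", "start": 0.0},
        {"text": "Today we cover Langflow.", "start": 2.5},
    ]
})

def transcript_text(payload: str) -> str:
    """Join transcript segments into one Document-style string."""
    data = json.loads(payload)
    return " ".join(seg["text"] for seg in data.get("transcripts", []))

doc = transcript_text(raw)
```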
@@ -22,7 +22,7 @@ The `ZeroShotAgent` takes the `LLMChain` and the `Search` tool as inputs, using
 
 <Admonition type="info">
 Learn more about the Serp API
-[here](https://python.langchain.com/docs/modules/agents/tools/integrations/serpapi).
+[here](https://python.langchain.com/docs/integrations/providers/serpapi).
 </Admonition>
 
 ## ⛓️ Langflow Example
@@ -35,6 +35,7 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 alt="Docusaurus themed image"
 sources={{
 light: "img/serp-api-tool.png",
 dark: "img/serp-api-tool.png",
 }}
 />
@@ -45,7 +46,7 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 - [`ZeroShotPrompt`](https://python.langchain.com/docs/modules/model_io/prompts/prompt_templates/)
 - [`OpenAI`](https://python.langchain.com/docs/modules/model_io/models/llms/integrations/openai)
 - [`LLMChain`](https://python.langchain.com/docs/modules/chains/foundational/llm_chain)
-- [`Search`](https://python.langchain.com/docs/modules/agents/tools/integrations/serpapi)
+- [`Search`](https://python.langchain.com/docs/integrations/providers/serpapi)
 - [`ZeroShotAgent`](https://python.langchain.com/docs/modules/agents/how_to/custom_mrkl_agent)
 
 </Admonition>
|
|
@@ -13,6 +13,7 @@ Creating flows with Langflow is easy. Drag sidebar components onto the canvas an
 alt="Docusaurus themed image"
 sources={{
 light: "img/langflow_canvas.png",
 dark: "img/langflow_canvas.png"
 }}
 />
|
|
@@ -12,6 +12,7 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 alt="Docusaurus themed image"
 sources={{
 light: "img/hugging-face.png",
 dark: "img/hugging-face.png",
 }}
 style={{ width: "100%" }}
 />
|
|
@@ -17,6 +17,7 @@ Langflow offers an API Key functionality that allows users to access their indiv
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/api-key.png"),
 dark: useBaseUrl("img/api-key.png"),
 }}
 style={{ width: "50%", maxWidth: "600px", margin: "0 auto" }}
 />
|
|
@@ -13,6 +13,7 @@ Langflow’s chat interface provides a user-friendly experience and functionalit
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/chat_interface.png"),
 dark: useBaseUrl("img/chat_interface.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -25,6 +26,7 @@ Notice that editing variables in the chat interface take place temporarily and w
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/chat_interface2.png"),
 dark: useBaseUrl("img/chat_interface2.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -36,6 +38,7 @@ To view the complete prompt in its original, structured format, click the "Displ
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/chat_interface3.png"),
 dark: useBaseUrl("img/chat_interface3.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -47,6 +50,7 @@ In the chat interface, you can redefine which variable should be interpreted as
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/chat_interface4.png"),
 dark: useBaseUrl("img/chat_interface4.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -38,6 +38,7 @@ import Admonition from "@theme/Admonition";
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/widget-sidebar.png"),
 dark: useBaseUrl("img/widget-sidebar.png"),
 }}
 style={{ width: "50%", maxWidth: "600px", margin: "0 auto" }}
 />
@@ -53,6 +54,7 @@ import Admonition from "@theme/Admonition";
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/widget-code.png"),
 dark: useBaseUrl("img/widget-code.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -30,6 +30,7 @@ Components are the building blocks of the flows. They are made of inputs, output
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/single-compenent.png"),
 dark: useBaseUrl("img/single-compenent.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -63,6 +63,7 @@ class DocumentProcessor(CustomComponent):
 alt="Document Processor Component"
 sources={{
 light: "img/document_processor.png",
 dark: "img/document_processor.png",
 }}
 style={{
 margin: "0 auto",
@@ -330,6 +331,7 @@ All done! This is what our script and brand-new custom component look like:
 alt="Document Processor Code"
 sources={{
 light: "img/document_processor_code.png",
 dark: "img/document_processor_code.png",
 }}
 style={{
 maxWidth: "100%",
@@ -344,6 +346,7 @@ All done! This is what our script and brand-new custom component look like:
 alt="Document Processor Component"
 sources={{
 light: "img/document_processor.png",
 dark: "img/document_processor.png",
 }}
 style={{
 width: "40%",
@@ -18,6 +18,7 @@ import Admonition from "@theme/Admonition";
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/features.png"),
 dark: useBaseUrl("img/features.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -86,6 +86,7 @@ With _`LANGFLOW_AUTO_LOGIN`_ set to _`False`_, Langflow requires users to sign u
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/sign-up.png"),
 dark: useBaseUrl("img/sign-up.png"),
 }}
 style={{ width: "50%", maxWidth: "600px", margin: "0 auto" }}
 />
@@ -102,6 +103,7 @@ Users can change their profile settings by clicking on the profile icon in the t
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/my-account.png"),
 dark: useBaseUrl("img/my-account.png"),
 }}
 style={{ width: "50%", maxWidth: "600px", margin: "0 auto" }}
 />
@@ -112,6 +114,7 @@ By clicking on **Profile Settings**, the user is taken to the profile settings p
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/profile-settings.png"),
 dark: useBaseUrl("img/profile-settings.png"),
 }}
 style={{ maxWidth: "600px", margin: "0 auto" }}
 />
@@ -122,6 +125,7 @@ By clicking on **Admin Page**, the superuser is taken to the admin page, where t
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/admin-page.png"),
 dark: useBaseUrl("img/admin-page.png"),
 }}
 style={{ maxWidth: "600px", margin: "0 auto" }}
@@ -13,6 +13,7 @@ The prompt template allows users to create prompts and define variables that pro
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/prompt_customization.png"),
 dark: useBaseUrl("img/prompt_customization.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -25,6 +26,7 @@ Variables can be used to define instructions, questions, context, inputs, or exa
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/prompt_customization2.png"),
 dark: useBaseUrl("img/prompt_customization2.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -37,6 +39,7 @@ Once inserted, these variables are immediately recognized as new fields in the p
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/prompt_customization3.png"),
 dark: useBaseUrl("img/prompt_customization3.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -49,6 +52,7 @@ You can also use documents or output parsers as prompt variables. By plugging th
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/prompt_customization4.png"),
 dark: useBaseUrl("img/prompt_customization4.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -63,6 +67,7 @@ If working with an interactive (chat-like) flow, remember to keep one of the inp
 alt="Docusaurus themed image"
 sources={{
 light: useBaseUrl("img/prompt_customization5.png"),
 dark: useBaseUrl("img/prompt_customization5.png"),
 }}
 style={{ width: "100%", maxWidth: "800px", margin: "0 auto" }}
 />
@@ -12,6 +12,7 @@ import ZoomableImage from "/src/theme/ZoomableImage.js";
 alt="Docusaurus themed image"
 sources={{
 light: "img/new_langflow_demo.gif",
 dark: "img/new_langflow_demo.gif",
 }}
 style={{ width: "100%" }}
 />
@@ -82,6 +82,7 @@ module.exports = {
 "examples/buffer-memory",
 "examples/midjourney-prompt-chain",
 "examples/csv-loader",
+"examples/searchapi-tool",
 "examples/serp-api-tool",
 "examples/multiple-vectorstores",
 "examples/python-function",
BIN docs/static/img/searchapi-tool.png (vendored, new file). Binary file not shown. After Width: | Height: | Size: 654 KiB
docs/static/json_files/SearchApi_Tool.json (vendored, new file, 1 line). File diff suppressed because one or more lines are too long.
BIN docs/static/videos/langflow_api.mp4 (vendored). Binary file not shown.
BIN docs/static/videos/langflow_build.mp4 (vendored). Binary file not shown.
BIN docs/static/videos/langflow_collection.mp4 (vendored). Binary file not shown.
BIN docs/static/videos/langflow_collection_example.mp4 (vendored). Binary file not shown.
BIN docs/static/videos/langflow_fork.mp4 (vendored). Binary file not shown.
BIN docs/static/videos/langflow_parameters.mp4 (vendored). Binary file not shown.
BIN docs/static/videos/langflow_widget.mp4 (vendored). Binary file not shown.
poetry.lock (generated, 480 lines changed)
@@ -418,17 +418,17 @@ files = [
 
 [[package]]
 name = "boto3"
-version = "1.34.50"
+version = "1.34.51"
 description = "The AWS SDK for Python"
 optional = false
 python-versions = ">= 3.8"
 files = [
-    {file = "boto3-1.34.50-py3-none-any.whl", hash = "sha256:8d709365231234bc4f0ca98fdf33a25eeebf78072853c6aa3d259f0f5cf09877"},
-    {file = "boto3-1.34.50.tar.gz", hash = "sha256:290952be7899560039cb0042e8a2354f61a7dead0d0ca8bea6ba901930df0468"},
+    {file = "boto3-1.34.51-py3-none-any.whl", hash = "sha256:67732634dc7d0afda879bd9a5e2d0818a2c14a98bef766b95a3e253ea5104cb9"},
+    {file = "boto3-1.34.51.tar.gz", hash = "sha256:2cd9463e738a184cbce8a6824027c22163c5f73e277a35ff5aa0fb0e845b4301"},
 ]
 
 [package.dependencies]
-botocore = ">=1.34.50,<1.35.0"
+botocore = ">=1.34.51,<1.35.0"
 jmespath = ">=0.7.1,<2.0.0"
 s3transfer = ">=0.10.0,<0.11.0"
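The lock-file bump above tightens `botocore` to `">=1.34.51,<1.35.0"`. A small sketch of how such a lower/upper range constrains a version (simple dotted-integer comparison only, not a full PEP 440 implementation):

```python
def parse(v: str) -> tuple:
    """Split a dotted version like '1.34.51' into comparable integers."""
    return tuple(int(p) for p in v.split("."))

def satisfies(version: str, lower: str, upper: str) -> bool:
    """Check a '>=lower,<upper' style range like botocore's in the lock file."""
    return parse(lower) <= parse(version) < parse(upper)

ok = satisfies("1.34.51", "1.34.51", "1.35.0")
```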
@@ -437,13 +437,13 @@ crt = ["botocore[crt] (>=1.21.0,<2.0a0)"]
 
 [[package]]
 name = "botocore"
-version = "1.34.50"
+version = "1.34.51"
 description = "Low-level, data-driven core of boto 3."
 optional = false
 python-versions = ">= 3.8"
 files = [
-    {file = "botocore-1.34.50-py3-none-any.whl", hash = "sha256:fda510559dbe796eefdb59561cc81be1b99afba3dee53fd23db9a3d587adc0ab"},
-    {file = "botocore-1.34.50.tar.gz", hash = "sha256:33ab82cb96c4bb684f0dbafb071808e4817d83debc88b223e7d988256370c6d7"},
+    {file = "botocore-1.34.51-py3-none-any.whl", hash = "sha256:01d5156247f991b3466a8404e3d7460a9ecbd9b214f9992d6ba797d9ddc6f120"},
+    {file = "botocore-1.34.51.tar.gz", hash = "sha256:5086217442e67dd9de36ec7e87a0c663f76b7790d5fb6a12de565af95e87e319"},
 ]
 
 [package.dependencies]
@@ -889,13 +889,13 @@ numpy = "*"
 
 [[package]]
 name = "chromadb"
-version = "0.4.23"
+version = "0.4.24"
 description = "Chroma."
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "chromadb-0.4.23-py3-none-any.whl", hash = "sha256:3d3c2ffb4ff560721e3daf8c1a3729fd149c551525b6f75543eddb81a4f29e16"},
-    {file = "chromadb-0.4.23.tar.gz", hash = "sha256:54d9a770640704c6cedc15317faab9fd45beb9833e7484c00037e7a8801a349f"},
+    {file = "chromadb-0.4.24-py3-none-any.whl", hash = "sha256:3a08e237a4ad28b5d176685bd22429a03717fe09d35022fb230d516108da01da"},
+    {file = "chromadb-0.4.24.tar.gz", hash = "sha256:a5c80b4e4ad9b236ed2d4899a5b9e8002b489293f2881cb2cadab5b199ee1c72"},
 ]
 
 [package.dependencies]
@@ -2322,6 +2322,214 @@ files = [
google-auth = "*"
httplib2 = ">=0.19.0"

[[package]]
name = "google-cloud-aiplatform"
version = "1.42.1"
description = "Vertex AI API client library"
optional = false
python-versions = ">=3.8"
files = [
    {file = "google-cloud-aiplatform-1.42.1.tar.gz", hash = "sha256:679068e068e29059d673a6410483fea762286fa07739d684fb1b4626698e0805"},
    {file = "google_cloud_aiplatform-1.42.1-py2.py3-none-any.whl", hash = "sha256:9f25ebd306807972cf05a578abc16695c4f72d4a2dd7e7b1624dbe247937ba24"},
]

[package.dependencies]
google-api-core = {version = ">=1.34.1,<2.0.dev0 || >=2.8.dev0,<3.0.0dev", extras = ["grpc"]}
google-auth = ">=2.14.1,<3.0.0dev"
google-cloud-bigquery = ">=1.15.0,<4.0.0dev"
google-cloud-resource-manager = ">=1.3.3,<3.0.0dev"
google-cloud-storage = ">=1.32.0,<3.0.0dev"
packaging = ">=14.3"
proto-plus = ">=1.22.0,<2.0.0dev"
protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<5.0.0dev"
shapely = "<3.0.0dev"

[package.extras]
autologging = ["mlflow (>=1.27.0,<=2.1.1)"]
cloud-profiler = ["tensorboard-plugin-profile (>=2.4.0,<3.0.0dev)", "tensorflow (>=2.4.0,<3.0.0dev)", "werkzeug (>=2.0.0,<2.1.0dev)"]
datasets = ["pyarrow (>=10.0.1)", "pyarrow (>=3.0.0,<8.0dev)"]
endpoint = ["requests (>=2.28.1)"]
full = ["cloudpickle (<3.0)", "docker (>=5.0.3)", "explainable-ai-sdk (>=1.0.0)", "fastapi (>=0.71.0,<0.103.1)", "google-cloud-bigquery", "google-cloud-bigquery-storage", "google-cloud-logging (<4.0)", "google-vizier (>=0.1.6)", "httpx (>=0.23.0,<0.25.0)", "lit-nlp (==0.4.0)", "mlflow (>=1.27.0,<=2.1.1)", "numpy (>=1.15.0)", "pandas (>=1.0.0)", "pyarrow (>=10.0.1)", "pyarrow (>=3.0.0,<8.0dev)", "pyarrow (>=6.0.1)", "pydantic (<2)", "pyyaml (==5.3.1)", "ray[default] (>=2.4,<2.5)", "ray[default] (>=2.5,<2.5.1)", "requests (>=2.28.1)", "starlette (>=0.17.1)", "tensorflow (>=2.3.0,<2.15.0)", "tensorflow (>=2.3.0,<3.0.0dev)", "urllib3 (>=1.21.1,<1.27)", "uvicorn[standard] (>=0.16.0)"]
lit = ["explainable-ai-sdk (>=1.0.0)", "lit-nlp (==0.4.0)", "pandas (>=1.0.0)", "tensorflow (>=2.3.0,<3.0.0dev)"]
metadata = ["numpy (>=1.15.0)", "pandas (>=1.0.0)"]
pipelines = ["pyyaml (==5.3.1)"]
prediction = ["docker (>=5.0.3)", "fastapi (>=0.71.0,<0.103.1)", "httpx (>=0.23.0,<0.25.0)", "starlette (>=0.17.1)", "uvicorn[standard] (>=0.16.0)"]
preview = ["cloudpickle (<3.0)", "google-cloud-logging (<4.0)"]
private-endpoints = ["requests (>=2.28.1)", "urllib3 (>=1.21.1,<1.27)"]
ray = ["google-cloud-bigquery", "google-cloud-bigquery-storage", "pandas (>=1.0.0)", "pyarrow (>=6.0.1)", "pydantic (<2)", "ray[default] (>=2.4,<2.5)", "ray[default] (>=2.5,<2.5.1)"]
tensorboard = ["tensorflow (>=2.3.0,<2.15.0)"]
testing = ["bigframes", "cloudpickle (<3.0)", "docker (>=5.0.3)", "explainable-ai-sdk (>=1.0.0)", "fastapi (>=0.71.0,<0.103.1)", "google-api-core (>=2.11,<3.0.0)", "google-cloud-bigquery", "google-cloud-bigquery-storage", "google-cloud-logging (<4.0)", "google-vizier (>=0.1.6)", "grpcio-testing", "httpx (>=0.23.0,<0.25.0)", "ipython", "kfp (>=2.6.0,<3.0.0)", "lit-nlp (==0.4.0)", "mlflow (>=1.27.0,<=2.1.1)", "numpy (>=1.15.0)", "pandas (>=1.0.0)", "pyarrow (>=10.0.1)", "pyarrow (>=3.0.0,<8.0dev)", "pyarrow (>=6.0.1)", "pydantic (<2)", "pyfakefs", "pytest-asyncio", "pytest-xdist", "pyyaml (==5.3.1)", "ray[default] (>=2.4,<2.5)", "ray[default] (>=2.5,<2.5.1)", "requests (>=2.28.1)", "requests-toolbelt (<1.0.0)", "scikit-learn", "starlette (>=0.17.1)", "tensorboard-plugin-profile (>=2.4.0,<3.0.0dev)", "tensorflow (>=2.3.0,<2.15.0)", "tensorflow (>=2.3.0,<3.0.0dev)", "tensorflow (>=2.3.0,<=2.12.0)", "tensorflow (>=2.4.0,<3.0.0dev)", "torch (>=2.0.0,<2.1.0)", "urllib3 (>=1.21.1,<1.27)", "uvicorn[standard] (>=0.16.0)", "werkzeug (>=2.0.0,<2.1.0dev)", "xgboost", "xgboost-ray"]
vizier = ["google-vizier (>=0.1.6)"]
xai = ["tensorflow (>=2.3.0,<3.0.0dev)"]

[[package]]
name = "google-cloud-bigquery"
version = "3.17.2"
description = "Google BigQuery API client library"
optional = false
python-versions = ">=3.7"
files = [
    {file = "google-cloud-bigquery-3.17.2.tar.gz", hash = "sha256:6e1cf669a40e567ab3289c7b5f2056363da9fcb85d9a4736ee90240d4a7d84ea"},
    {file = "google_cloud_bigquery-3.17.2-py2.py3-none-any.whl", hash = "sha256:cdadf5283dca55a1a350bacf8c8a7466169d3cf46c5a0a3abc5e9aa0b0a51dee"},
]

[package.dependencies]
google-api-core = ">=1.31.5,<2.0.dev0 || >2.3.0,<3.0.0dev"
google-cloud-core = ">=1.6.0,<3.0.0dev"
google-resumable-media = ">=0.6.0,<3.0dev"
packaging = ">=20.0.0"
python-dateutil = ">=2.7.2,<3.0dev"
requests = ">=2.21.0,<3.0.0dev"

[package.extras]
all = ["Shapely (>=1.8.4,<3.0.0dev)", "db-dtypes (>=0.3.0,<2.0.0dev)", "geopandas (>=0.9.0,<1.0dev)", "google-cloud-bigquery-storage (>=2.6.0,<3.0.0dev)", "grpcio (>=1.47.0,<2.0dev)", "grpcio (>=1.49.1,<2.0dev)", "importlib-metadata (>=1.0.0)", "ipykernel (>=6.0.0)", "ipython (>=7.23.1,!=8.1.0)", "ipywidgets (>=7.7.0)", "opentelemetry-api (>=1.1.0)", "opentelemetry-instrumentation (>=0.20b0)", "opentelemetry-sdk (>=1.1.0)", "pandas (>=1.1.0)", "proto-plus (>=1.15.0,<2.0.0dev)", "protobuf (>=3.19.5,!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev)", "pyarrow (>=3.0.0)", "tqdm (>=4.7.4,<5.0.0dev)"]
bigquery-v2 = ["proto-plus (>=1.15.0,<2.0.0dev)", "protobuf (>=3.19.5,!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev)"]
bqstorage = ["google-cloud-bigquery-storage (>=2.6.0,<3.0.0dev)", "grpcio (>=1.47.0,<2.0dev)", "grpcio (>=1.49.1,<2.0dev)", "pyarrow (>=3.0.0)"]
geopandas = ["Shapely (>=1.8.4,<3.0.0dev)", "geopandas (>=0.9.0,<1.0dev)"]
ipython = ["ipykernel (>=6.0.0)", "ipython (>=7.23.1,!=8.1.0)"]
ipywidgets = ["ipykernel (>=6.0.0)", "ipywidgets (>=7.7.0)"]
opentelemetry = ["opentelemetry-api (>=1.1.0)", "opentelemetry-instrumentation (>=0.20b0)", "opentelemetry-sdk (>=1.1.0)"]
pandas = ["db-dtypes (>=0.3.0,<2.0.0dev)", "importlib-metadata (>=1.0.0)", "pandas (>=1.1.0)", "pyarrow (>=3.0.0)"]
tqdm = ["tqdm (>=4.7.4,<5.0.0dev)"]

[[package]]
name = "google-cloud-core"
version = "2.4.1"
description = "Google Cloud API client core library"
optional = false
python-versions = ">=3.7"
files = [
    {file = "google-cloud-core-2.4.1.tar.gz", hash = "sha256:9b7749272a812bde58fff28868d0c5e2f585b82f37e09a1f6ed2d4d10f134073"},
    {file = "google_cloud_core-2.4.1-py2.py3-none-any.whl", hash = "sha256:a9e6a4422b9ac5c29f79a0ede9485473338e2ce78d91f2370c01e730eab22e61"},
]

[package.dependencies]
google-api-core = ">=1.31.6,<2.0.dev0 || >2.3.0,<3.0.0dev"
google-auth = ">=1.25.0,<3.0dev"

[package.extras]
grpc = ["grpcio (>=1.38.0,<2.0dev)", "grpcio-status (>=1.38.0,<2.0.dev0)"]

[[package]]
name = "google-cloud-resource-manager"
version = "1.12.2"
description = "Google Cloud Resource Manager API client library"
optional = false
python-versions = ">=3.7"
files = [
    {file = "google-cloud-resource-manager-1.12.2.tar.gz", hash = "sha256:2ede446a5087b236f0e1fb39cca3791bae97eb0d9125057401454b190d5572ee"},
    {file = "google_cloud_resource_manager-1.12.2-py2.py3-none-any.whl", hash = "sha256:45abbb8911195cc831cc77c8e3be84decc271686579b332d4142af507f423ebf"},
]

[package.dependencies]
google-api-core = {version = ">=1.34.1,<2.0.dev0 || >=2.11.dev0,<3.0.0dev", extras = ["grpc"]}
google-auth = ">=2.14.1,<3.0.0dev"
grpc-google-iam-v1 = ">=0.12.4,<1.0.0dev"
proto-plus = ">=1.22.3,<2.0.0dev"
protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<5.0.0dev"

[[package]]
name = "google-cloud-storage"
version = "2.14.0"
description = "Google Cloud Storage API client library"
optional = false
python-versions = ">=3.7"
files = [
    {file = "google-cloud-storage-2.14.0.tar.gz", hash = "sha256:2d23fcf59b55e7b45336729c148bb1c464468c69d5efbaee30f7201dd90eb97e"},
    {file = "google_cloud_storage-2.14.0-py2.py3-none-any.whl", hash = "sha256:8641243bbf2a2042c16a6399551fbb13f062cbc9a2de38d6c0bb5426962e9dbd"},
]

[package.dependencies]
google-api-core = ">=1.31.5,<2.0.dev0 || >2.3.0,<3.0.0dev"
google-auth = ">=2.23.3,<3.0dev"
google-cloud-core = ">=2.3.0,<3.0dev"
google-crc32c = ">=1.0,<2.0dev"
google-resumable-media = ">=2.6.0"
requests = ">=2.18.0,<3.0.0dev"

[package.extras]
protobuf = ["protobuf (<5.0.0dev)"]

[[package]]
name = "google-crc32c"
version = "1.5.0"
description = "A python wrapper of the C library 'Google CRC32C'"
optional = false
python-versions = ">=3.7"
files = [
    {file = "google-crc32c-1.5.0.tar.gz", hash = "sha256:89284716bc6a5a415d4eaa11b1726d2d60a0cd12aadf5439828353662ede9dd7"},
    {file = "google_crc32c-1.5.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:596d1f98fc70232fcb6590c439f43b350cb762fb5d61ce7b0e9db4539654cc13"},
    {file = "google_crc32c-1.5.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:be82c3c8cfb15b30f36768797a640e800513793d6ae1724aaaafe5bf86f8f346"},
    {file = "google_crc32c-1.5.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:461665ff58895f508e2866824a47bdee72497b091c730071f2b7575d5762ab65"},
    {file = "google_crc32c-1.5.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e2096eddb4e7c7bdae4bd69ad364e55e07b8316653234a56552d9c988bd2d61b"},
    {file = "google_crc32c-1.5.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:116a7c3c616dd14a3de8c64a965828b197e5f2d121fedd2f8c5585c547e87b02"},
    {file = "google_crc32c-1.5.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:5829b792bf5822fd0a6f6eb34c5f81dd074f01d570ed7f36aa101d6fc7a0a6e4"},
    {file = "google_crc32c-1.5.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:64e52e2b3970bd891309c113b54cf0e4384762c934d5ae56e283f9a0afcd953e"},
    {file = "google_crc32c-1.5.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:02ebb8bf46c13e36998aeaad1de9b48f4caf545e91d14041270d9dca767b780c"},
    {file = "google_crc32c-1.5.0-cp310-cp310-win32.whl", hash = "sha256:2e920d506ec85eb4ba50cd4228c2bec05642894d4c73c59b3a2fe20346bd00ee"},
    {file = "google_crc32c-1.5.0-cp310-cp310-win_amd64.whl", hash = "sha256:07eb3c611ce363c51a933bf6bd7f8e3878a51d124acfc89452a75120bc436289"},
    {file = "google_crc32c-1.5.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:cae0274952c079886567f3f4f685bcaf5708f0a23a5f5216fdab71f81a6c0273"},
    {file = "google_crc32c-1.5.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1034d91442ead5a95b5aaef90dbfaca8633b0247d1e41621d1e9f9db88c36298"},
    {file = "google_crc32c-1.5.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7c42c70cd1d362284289c6273adda4c6af8039a8ae12dc451dcd61cdabb8ab57"},
    {file = "google_crc32c-1.5.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8485b340a6a9e76c62a7dce3c98e5f102c9219f4cfbf896a00cf48caf078d438"},
    {file = "google_crc32c-1.5.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:77e2fd3057c9d78e225fa0a2160f96b64a824de17840351b26825b0848022906"},
    {file = "google_crc32c-1.5.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:f583edb943cf2e09c60441b910d6a20b4d9d626c75a36c8fcac01a6c96c01183"},
    {file = "google_crc32c-1.5.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:a1fd716e7a01f8e717490fbe2e431d2905ab8aa598b9b12f8d10abebb36b04dd"},
    {file = "google_crc32c-1.5.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:72218785ce41b9cfd2fc1d6a017dc1ff7acfc4c17d01053265c41a2c0cc39b8c"},
    {file = "google_crc32c-1.5.0-cp311-cp311-win32.whl", hash = "sha256:66741ef4ee08ea0b2cc3c86916ab66b6aef03768525627fd6a1b34968b4e3709"},
    {file = "google_crc32c-1.5.0-cp311-cp311-win_amd64.whl", hash = "sha256:ba1eb1843304b1e5537e1fca632fa894d6f6deca8d6389636ee5b4797affb968"},
    {file = "google_crc32c-1.5.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:98cb4d057f285bd80d8778ebc4fde6b4d509ac3f331758fb1528b733215443ae"},
    {file = "google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fd8536e902db7e365f49e7d9029283403974ccf29b13fc7028b97e2295b33556"},
    {file = "google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:19e0a019d2c4dcc5e598cd4a4bc7b008546b0358bd322537c74ad47a5386884f"},
    {file = "google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02c65b9817512edc6a4ae7c7e987fea799d2e0ee40c53ec573a692bee24de876"},
    {file = "google_crc32c-1.5.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:6ac08d24c1f16bd2bf5eca8eaf8304812f44af5cfe5062006ec676e7e1d50afc"},
    {file = "google_crc32c-1.5.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:3359fc442a743e870f4588fcf5dcbc1bf929df1fad8fb9905cd94e5edb02e84c"},
    {file = "google_crc32c-1.5.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:1e986b206dae4476f41bcec1faa057851f3889503a70e1bdb2378d406223994a"},
    {file = "google_crc32c-1.5.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:de06adc872bcd8c2a4e0dc51250e9e65ef2ca91be023b9d13ebd67c2ba552e1e"},
    {file = "google_crc32c-1.5.0-cp37-cp37m-win32.whl", hash = "sha256:d3515f198eaa2f0ed49f8819d5732d70698c3fa37384146079b3799b97667a94"},
    {file = "google_crc32c-1.5.0-cp37-cp37m-win_amd64.whl", hash = "sha256:67b741654b851abafb7bc625b6d1cdd520a379074e64b6a128e3b688c3c04740"},
    {file = "google_crc32c-1.5.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:c02ec1c5856179f171e032a31d6f8bf84e5a75c45c33b2e20a3de353b266ebd8"},
    {file = "google_crc32c-1.5.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:edfedb64740750e1a3b16152620220f51d58ff1b4abceb339ca92e934775c27a"},
    {file = "google_crc32c-1.5.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:84e6e8cd997930fc66d5bb4fde61e2b62ba19d62b7abd7a69920406f9ecca946"},
    {file = "google_crc32c-1.5.0-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:024894d9d3cfbc5943f8f230e23950cd4906b2fe004c72e29b209420a1e6b05a"},
    {file = "google_crc32c-1.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:998679bf62b7fb599d2878aa3ed06b9ce688b8974893e7223c60db155f26bd8d"},
    {file = "google_crc32c-1.5.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:83c681c526a3439b5cf94f7420471705bbf96262f49a6fe546a6db5f687a3d4a"},
    {file = "google_crc32c-1.5.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:4c6fdd4fccbec90cc8a01fc00773fcd5fa28db683c116ee3cb35cd5da9ef6c37"},
    {file = "google_crc32c-1.5.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:5ae44e10a8e3407dbe138984f21e536583f2bba1be9491239f942c2464ac0894"},
    {file = "google_crc32c-1.5.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:37933ec6e693e51a5b07505bd05de57eee12f3e8c32b07da7e73669398e6630a"},
    {file = "google_crc32c-1.5.0-cp38-cp38-win32.whl", hash = "sha256:fe70e325aa68fa4b5edf7d1a4b6f691eb04bbccac0ace68e34820d283b5f80d4"},
    {file = "google_crc32c-1.5.0-cp38-cp38-win_amd64.whl", hash = "sha256:74dea7751d98034887dbd821b7aae3e1d36eda111d6ca36c206c44478035709c"},
    {file = "google_crc32c-1.5.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c6c777a480337ac14f38564ac88ae82d4cd238bf293f0a22295b66eb89ffced7"},
    {file = "google_crc32c-1.5.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:759ce4851a4bb15ecabae28f4d2e18983c244eddd767f560165563bf9aefbc8d"},
    {file = "google_crc32c-1.5.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f13cae8cc389a440def0c8c52057f37359014ccbc9dc1f0827936bcd367c6100"},
    {file = "google_crc32c-1.5.0-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e560628513ed34759456a416bf86b54b2476c59144a9138165c9a1575801d0d9"},
    {file = "google_crc32c-1.5.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e1674e4307fa3024fc897ca774e9c7562c957af85df55efe2988ed9056dc4e57"},
    {file = "google_crc32c-1.5.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:278d2ed7c16cfc075c91378c4f47924c0625f5fc84b2d50d921b18b7975bd210"},
    {file = "google_crc32c-1.5.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d5280312b9af0976231f9e317c20e4a61cd2f9629b7bfea6a693d1878a264ebd"},
    {file = "google_crc32c-1.5.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:8b87e1a59c38f275c0e3676fc2ab6d59eccecfd460be267ac360cc31f7bcde96"},
    {file = "google_crc32c-1.5.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:7c074fece789b5034b9b1404a1f8208fc2d4c6ce9decdd16e8220c5a793e6f61"},
    {file = "google_crc32c-1.5.0-cp39-cp39-win32.whl", hash = "sha256:7f57f14606cd1dd0f0de396e1e53824c371e9544a822648cd76c034d209b559c"},
    {file = "google_crc32c-1.5.0-cp39-cp39-win_amd64.whl", hash = "sha256:a2355cba1f4ad8b6988a4ca3feed5bff33f6af2d7f134852cf279c2aebfde541"},
    {file = "google_crc32c-1.5.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:f314013e7dcd5cf45ab1945d92e713eec788166262ae8deb2cfacd53def27325"},
    {file = "google_crc32c-1.5.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3b747a674c20a67343cb61d43fdd9207ce5da6a99f629c6e2541aa0e89215bcd"},
    {file = "google_crc32c-1.5.0-pp37-pypy37_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8f24ed114432de109aa9fd317278518a5af2d31ac2ea6b952b2f7782b43da091"},
    {file = "google_crc32c-1.5.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b8667b48e7a7ef66afba2c81e1094ef526388d35b873966d8a9a447974ed9178"},
    {file = "google_crc32c-1.5.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:1c7abdac90433b09bad6c43a43af253e688c9cfc1c86d332aed13f9a7c7f65e2"},
    {file = "google_crc32c-1.5.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:6f998db4e71b645350b9ac28a2167e6632c239963ca9da411523bb439c5c514d"},
    {file = "google_crc32c-1.5.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9c99616c853bb585301df6de07ca2cadad344fd1ada6d62bb30aec05219c45d2"},
    {file = "google_crc32c-1.5.0-pp38-pypy38_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2ad40e31093a4af319dadf503b2467ccdc8f67c72e4bcba97f8c10cb078207b5"},
    {file = "google_crc32c-1.5.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cd67cf24a553339d5062eff51013780a00d6f97a39ca062781d06b3a73b15462"},
    {file = "google_crc32c-1.5.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:398af5e3ba9cf768787eef45c803ff9614cc3e22a5b2f7d7ae116df8b11e3314"},
    {file = "google_crc32c-1.5.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:b1f8133c9a275df5613a451e73f36c2aea4fe13c5c8997e22cf355ebd7bd0728"},
    {file = "google_crc32c-1.5.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9ba053c5f50430a3fcfd36f75aff9caeba0440b2d076afdb79a318d6ca245f88"},
    {file = "google_crc32c-1.5.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:272d3892a1e1a2dbc39cc5cde96834c236d5327e2122d3aaa19f6614531bb6eb"},
    {file = "google_crc32c-1.5.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:635f5d4dd18758a1fbd1049a8e8d2fee4ffed124462d837d1a02a0e009c3ab31"},
    {file = "google_crc32c-1.5.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:c672d99a345849301784604bfeaeba4db0c7aae50b95be04dd651fd2a7310b93"},
]

[package.extras]
testing = ["pytest"]

[[package]]
name = "google-generativeai"
version = "0.3.2"
@@ -2375,13 +2583,13 @@ grpc = ["grpcio (>=1.44.0,<2.0.0.dev0)"]

[[package]]
name = "gotrue"
version = "2.1.0"
version = "2.4.1"
description = "Python Client Library for GoTrue"
optional = false
python-versions = ">=3.8,<4.0"
files = [
    {file = "gotrue-2.1.0-py3-none-any.whl", hash = "sha256:6483d9a3ac9be1d1ad510be24171e133aa1cec702cc10a8f323b9e7519642447"},
    {file = "gotrue-2.1.0.tar.gz", hash = "sha256:b21d48ee64f0f6a1ed111efe4871a83e542529f1a75a264833b50e6433cd3c98"},
    {file = "gotrue-2.4.1-py3-none-any.whl", hash = "sha256:9647bb7a585c969d26667df21168fa20b18f91c5d6afe286af08d7a0610fd2cc"},
    {file = "gotrue-2.4.1.tar.gz", hash = "sha256:8b260ef285f45a3a2f9b5a006f12afb9fad7a36a28fa277f19e733f22eb88584"},
]

[package.dependencies]
@@ -3332,13 +3540,13 @@ extended-testing = ["aiosqlite (>=0.19.0,<0.20.0)", "aleph-alpha-client (>=2.15.

[[package]]
name = "langchain-core"
version = "0.1.26"
version = "0.1.27"
description = "Building applications with LLMs through composability"
optional = false
python-versions = ">=3.8.1,<4.0"
files = [
    {file = "langchain_core-0.1.26-py3-none-any.whl", hash = "sha256:4f54cd26c27473172d7a214a5507a4c0e3255c6d8c25d9087afdc967f5588516"},
    {file = "langchain_core-0.1.26.tar.gz", hash = "sha256:6186758d62015723aac67ef1a2055695d03e82c4dd4074217975b0c62faf4b17"},
    {file = "langchain_core-0.1.27-py3-none-any.whl", hash = "sha256:68eb89dc4a932baf4fb6b4b75b7119eec9e5405e892d2137e9fe0a1d24a40d0c"},
    {file = "langchain_core-0.1.27.tar.gz", hash = "sha256:698414223525c0bc130d85a614e1493905d588ab72fe0c9ad3b537b1dc62067f"},
]

[package.dependencies]
@@ -3423,13 +3631,13 @@ six = "*"

[[package]]
name = "langfuse"
version = "2.16.2"
version = "2.18.2"
description = "A client library for accessing langfuse"
optional = false
python-versions = ">=3.8.1,<3.13"
python-versions = ">=3.8.1,<4.0"
files = [
    {file = "langfuse-2.16.2-py3-none-any.whl", hash = "sha256:6aa6911a9fb9cb1a9ffc67c9afb88e741ff95888ca711b03278e49b73a36adb1"},
    {file = "langfuse-2.16.2.tar.gz", hash = "sha256:17bec8a86497a836a75fd87aeebd8d333800570b5988eae71d72d1d99fe22431"},
    {file = "langfuse-2.18.2-py3-none-any.whl", hash = "sha256:096f809ead95edb3605dbc2afb0f62727d357d76c587c97c753c86db14b1db24"},
    {file = "langfuse-2.18.2.tar.gz", hash = "sha256:8aa349a5b923cb5335e9b0602795d01628aeabf81c25544e073d6170c8b8fd6a"},
]

[package.dependencies]
@@ -3443,17 +3651,17 @@ wrapt = "1.14"

[package.extras]
langchain = ["langchain (>=0.0.309)"]
llama-index = ["llama-index (>=0.10.6,<0.11.0)"]
llama-index = ["llama-index (>=0.10.12,<0.11.0)"]

[[package]]
name = "langsmith"
version = "0.1.9"
version = "0.1.10"
description = "Client library to connect to the LangSmith LLM Tracing and Evaluation Platform."
optional = false
python-versions = ">=3.8.1,<4.0"
files = [
    {file = "langsmith-0.1.9-py3-none-any.whl", hash = "sha256:f821b3cb07a87eac5cb2181ff0b61051811e4eef09ae4b46e700981f7ae5dfb9"},
    {file = "langsmith-0.1.9.tar.gz", hash = "sha256:9bd3e80607722c3d2db84cf3440005491a859b80b5e499bc988032d5c2da91f0"},
    {file = "langsmith-0.1.10-py3-none-any.whl", hash = "sha256:2997a80aea60ed235d83502a7ccdc1f62ffb4dd6b3b7dd4218e8fa4de68a6725"},
    {file = "langsmith-0.1.10.tar.gz", hash = "sha256:13e7e8b52e694aa4003370cefbb9e79cce3540c65dbf1517902bf7aa4dbbb653"},
]

[package.dependencies]
@@ -3480,12 +3688,12 @@ regex = ["regex"]

[[package]]
name = "llama-cpp-python"
version = "0.2.52"
version = "0.2.53"
description = "Python bindings for the llama.cpp library"
optional = true
python-versions = ">=3.8"
files = [
    {file = "llama_cpp_python-0.2.52.tar.gz", hash = "sha256:cc3f670ea5b315547396b0bbc108fcc9602d19b8af858e03c4c0fae385fb9a04"},
    {file = "llama_cpp_python-0.2.53.tar.gz", hash = "sha256:f7ff8eda538ca6c80521a8bbf80d3ef4527ecb28f6d08fa9b3bb1f0cfc3b684e"},
]

[package.dependencies]
@@ -3859,84 +4067,99 @@ description = "Powerful and Pythonic XML processing library combining libxml2/li
optional = false
python-versions = ">=3.6"
files = [
    {file = "lxml-5.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:704f5572ff473a5f897745abebc6df40f22d4133c1e0a1f124e4f2bd3330ff7e"},
    {file = "lxml-5.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:9d3c0f8567ffe7502d969c2c1b809892dc793b5d0665f602aad19895f8d508da"},
    {file = "lxml-5.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5fcfbebdb0c5d8d18b84118842f31965d59ee3e66996ac842e21f957eb76138c"},
    {file = "lxml-5.1.0-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2f37c6d7106a9d6f0708d4e164b707037b7380fcd0b04c5bd9cae1fb46a856fb"},
    {file = "lxml-5.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2befa20a13f1a75c751f47e00929fb3433d67eb9923c2c0b364de449121f447c"},
    {file = "lxml-5.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22b7ee4c35f374e2c20337a95502057964d7e35b996b1c667b5c65c567d2252a"},
    {file = "lxml-5.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:bf8443781533b8d37b295016a4b53c1494fa9a03573c09ca5104550c138d5c05"},
    {file = "lxml-5.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:82bddf0e72cb2af3cbba7cec1d2fd11fda0de6be8f4492223d4a268713ef2147"},
    {file = "lxml-5.1.0-cp310-cp310-win32.whl", hash = "sha256:b66aa6357b265670bb574f050ffceefb98549c721cf28351b748be1ef9577d93"},
    {file = "lxml-5.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:4946e7f59b7b6a9e27bef34422f645e9a368cb2be11bf1ef3cafc39a1f6ba68d"},
    {file = "lxml-5.1.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:14deca1460b4b0f6b01f1ddc9557704e8b365f55c63070463f6c18619ebf964f"},
    {file = "lxml-5.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ed8c3d2cd329bf779b7ed38db176738f3f8be637bb395ce9629fc76f78afe3d4"},
    {file = "lxml-5.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:436a943c2900bb98123b06437cdd30580a61340fbdb7b28aaf345a459c19046a"},
    {file = "lxml-5.1.0-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:acb6b2f96f60f70e7f34efe0c3ea34ca63f19ca63ce90019c6cbca6b676e81fa"},
    {file = "lxml-5.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:af8920ce4a55ff41167ddbc20077f5698c2e710ad3353d32a07d3264f3a2021e"},
    {file = "lxml-5.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7cfced4a069003d8913408e10ca8ed092c49a7f6cefee9bb74b6b3e860683b45"},
    {file = "lxml-5.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:9e5ac3437746189a9b4121db2a7b86056ac8786b12e88838696899328fc44bb2"},
    {file = "lxml-5.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:f4c9bda132ad108b387c33fabfea47866af87f4ea6ffb79418004f0521e63204"},
    {file = "lxml-5.1.0-cp311-cp311-win32.whl", hash = "sha256:bc64d1b1dab08f679fb89c368f4c05693f58a9faf744c4d390d7ed1d8223869b"},
    {file = "lxml-5.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:a5ab722ae5a873d8dcee1f5f45ddd93c34210aed44ff2dc643b5025981908cda"},
    {file = "lxml-5.1.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:9aa543980ab1fbf1720969af1d99095a548ea42e00361e727c58a40832439114"},
    {file = "lxml-5.1.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:6f11b77ec0979f7e4dc5ae081325a2946f1fe424148d3945f943ceaede98adb8"},
    {file = "lxml-5.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a36c506e5f8aeb40680491d39ed94670487ce6614b9d27cabe45d94cd5d63e1e"},
    {file = "lxml-5.1.0-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f643ffd2669ffd4b5a3e9b41c909b72b2a1d5e4915da90a77e119b8d48ce867a"},
    {file = "lxml-5.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:16dd953fb719f0ffc5bc067428fc9e88f599e15723a85618c45847c96f11f431"},
    {file = "lxml-5.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:16018f7099245157564d7148165132c70adb272fb5a17c048ba70d9cc542a1a1"},
    {file = "lxml-5.1.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:82cd34f1081ae4ea2ede3d52f71b7be313756e99b4b5f829f89b12da552d3aa3"},
    {file = "lxml-5.1.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:19a1bc898ae9f06bccb7c3e1dfd73897ecbbd2c96afe9095a6026016e5ca97b8"},
    {file = "lxml-5.1.0-cp312-cp312-win32.whl", hash = "sha256:13521a321a25c641b9ea127ef478b580b5ec82aa2e9fc076c86169d161798b01"},
    {file = "lxml-5.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:1ad17c20e3666c035db502c78b86e58ff6b5991906e55bdbef94977700c72623"},
    {file = "lxml-5.1.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:24ef5a4631c0b6cceaf2dbca21687e29725b7c4e171f33a8f8ce23c12558ded1"},
    {file = "lxml-5.1.0-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8d2900b7f5318bc7ad8631d3d40190b95ef2aa8cc59473b73b294e4a55e9f30f"},
    {file = "lxml-5.1.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:601f4a75797d7a770daed8b42b97cd1bb1ba18bd51a9382077a6a247a12aa38d"},
    {file = "lxml-5.1.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b4b68c961b5cc402cbd99cca5eb2547e46ce77260eb705f4d117fd9c3f932b95"},
    {file = "lxml-5.1.0-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:afd825e30f8d1f521713a5669b63657bcfe5980a916c95855060048b88e1adb7"},
    {file = "lxml-5.1.0-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:262bc5f512a66b527d026518507e78c2f9c2bd9eb5c8aeeb9f0eb43fcb69dc67"},
    {file = "lxml-5.1.0-cp36-cp36m-win32.whl", hash = "sha256:e856c1c7255c739434489ec9c8aa9cdf5179785d10ff20add308b5d673bed5cd"},
    {file = "lxml-5.1.0-cp36-cp36m-win_amd64.whl", hash = "sha256:c7257171bb8d4432fe9d6fdde4d55fdbe663a63636a17f7f9aaba9bcb3153ad7"},
    {file = "lxml-5.1.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b9e240ae0ba96477682aa87899d94ddec1cc7926f9df29b1dd57b39e797d5ab5"},
    {file = "lxml-5.1.0-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a96f02ba1bcd330807fc060ed91d1f7a20853da6dd449e5da4b09bfcc08fdcf5"},
    {file = "lxml-5.1.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e3898ae2b58eeafedfe99e542a17859017d72d7f6a63de0f04f99c2cb125936"},
    {file = "lxml-5.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:61c5a7edbd7c695e54fca029ceb351fc45cd8860119a0f83e48be44e1c464862"},
    {file = "lxml-5.1.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:3aeca824b38ca78d9ee2ab82bd9883083d0492d9d17df065ba3b94e88e4d7ee6"},
    {file = "lxml-5.1.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8f52fe6859b9db71ee609b0c0a70fea5f1e71c3462ecf144ca800d3f434f0764"},
    {file = "lxml-5.1.0-cp37-cp37m-win32.whl", hash = "sha256:d42e3a3fc18acc88b838efded0e6ec3edf3e328a58c68fbd36a7263a874906c8"},
    {file = "lxml-5.1.0-cp37-cp37m-win_amd64.whl", hash = "sha256:eac68f96539b32fce2c9b47eb7c25bb2582bdaf1bbb360d25f564ee9e04c542b"},
    {file = "lxml-5.1.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:ae15347a88cf8af0949a9872b57a320d2605ae069bcdf047677318bc0bba45b1"},
    {file = "lxml-5.1.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:c26aab6ea9c54d3bed716b8851c8bfc40cb249b8e9880e250d1eddde9f709bf5"},
    {file = "lxml-5.1.0-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:342e95bddec3a698ac24378d61996b3ee5ba9acfeb253986002ac53c9a5f6f84"},
    {file = "lxml-5.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:725e171e0b99a66ec8605ac77fa12239dbe061482ac854d25720e2294652eeaa"},
    {file = "lxml-5.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d184e0d5c918cff04cdde9dbdf9600e960161d773666958c9d7b565ccc60c45"},
    {file = "lxml-5.1.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:98f3f020a2b736566c707c8e034945c02aa94e124c24f77ca097c446f81b01f1"},
    {file = "lxml-5.1.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6d48fc57e7c1e3df57be5ae8614bab6d4e7b60f65c5457915c26892c41afc59e"},
    {file = "lxml-5.1.0-cp38-cp38-win32.whl", hash = "sha256:7ec465e6549ed97e9f1e5ed51c657c9ede767bc1c11552f7f4d022c4df4a977a"},
    {file = "lxml-5.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:b21b4031b53d25b0858d4e124f2f9131ffc1530431c6d1321805c90da78388d1"},
    {file = "lxml-5.1.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:52427a7eadc98f9e62cb1368a5079ae826f94f05755d2d567d93ee1bc3ceb354"},
    {file = "lxml-5.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6a2a2c724d97c1eb8cf966b16ca2915566a4904b9aad2ed9a09c748ffe14f969"},
    {file = "lxml-5.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:843b9c835580d52828d8f69ea4302537337a21e6b4f1ec711a52241ba4a824f3"},
    {file = "lxml-5.1.0-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9b99f564659cfa704a2dd82d0684207b1aadf7d02d33e54845f9fc78e06b7581"},
    {file = "lxml-5.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4f8b0c78e7aac24979ef09b7f50da871c2de2def043d468c4b41f512d831e912"},
    {file = "lxml-5.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9bcf86dfc8ff3e992fed847c077bd875d9e0ba2fa25d859c3a0f0f76f07f0c8d"},
|
||||
{file = "lxml-5.1.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:49a9b4af45e8b925e1cd6f3b15bbba2c81e7dba6dce170c677c9cda547411e14"},
|
||||
{file = "lxml-5.1.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:280f3edf15c2a967d923bcfb1f8f15337ad36f93525828b40a0f9d6c2ad24890"},
|
||||
{file = "lxml-5.1.0-cp39-cp39-win32.whl", hash = "sha256:ed7326563024b6e91fef6b6c7a1a2ff0a71b97793ac33dbbcf38f6005e51ff6e"},
|
||||
{file = "lxml-5.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:8d7b4beebb178e9183138f552238f7e6613162a42164233e2bda00cb3afac58f"},
|
||||
{file = "lxml-5.1.0-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:9bd0ae7cc2b85320abd5e0abad5ccee5564ed5f0cc90245d2f9a8ef330a8deae"},
|
||||
{file = "lxml-5.1.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d8c1d679df4361408b628f42b26a5d62bd3e9ba7f0c0e7969f925021554755aa"},
|
||||
{file = "lxml-5.1.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:2ad3a8ce9e8a767131061a22cd28fdffa3cd2dc193f399ff7b81777f3520e372"},
|
||||
{file = "lxml-5.1.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:304128394c9c22b6569eba2a6d98392b56fbdfbad58f83ea702530be80d0f9df"},
|
||||
{file = "lxml-5.1.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d74fcaf87132ffc0447b3c685a9f862ffb5b43e70ea6beec2fb8057d5d2a1fea"},
|
||||
{file = "lxml-5.1.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:8cf5877f7ed384dabfdcc37922c3191bf27e55b498fecece9fd5c2c7aaa34c33"},
|
||||
{file = "lxml-5.1.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:877efb968c3d7eb2dad540b6cabf2f1d3c0fbf4b2d309a3c141f79c7e0061324"},
|
||||
{file = "lxml-5.1.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3f14a4fb1c1c402a22e6a341a24c1341b4a3def81b41cd354386dcb795f83897"},
|
||||
{file = "lxml-5.1.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:25663d6e99659544ee8fe1b89b1a8c0aaa5e34b103fab124b17fa958c4a324a6"},
|
||||
{file = "lxml-5.1.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:8b9f19df998761babaa7f09e6bc169294eefafd6149aaa272081cbddc7ba4ca3"},
|
||||
{file = "lxml-5.1.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e53d7e6a98b64fe54775d23a7c669763451340c3d44ad5e3a3b48a1efbdc96f"},
|
||||
{file = "lxml-5.1.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:c3cd1fc1dc7c376c54440aeaaa0dcc803d2126732ff5c6b68ccd619f2e64be4f"},
|
||||
{file = "lxml-5.1.0.tar.gz", hash = "sha256:3eea6ed6e6c918e468e693c41ef07f3c3acc310b70ddd9cc72d9ef84bc9564ca"},
|
||||
{file = "lxml-4.9.4-cp27-cp27m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e214025e23db238805a600f1f37bf9f9a15413c7bf5f9d6ae194f84980c78722"},
|
||||
{file = "lxml-4.9.4-cp27-cp27m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:ec53a09aee61d45e7dbe7e91252ff0491b6b5fee3d85b2d45b173d8ab453efc1"},
|
||||
{file = "lxml-4.9.4-cp27-cp27m-win32.whl", hash = "sha256:7d1d6c9e74c70ddf524e3c09d9dc0522aba9370708c2cb58680ea40174800013"},
|
||||
{file = "lxml-4.9.4-cp27-cp27m-win_amd64.whl", hash = "sha256:cb53669442895763e61df5c995f0e8361b61662f26c1b04ee82899c2789c8f69"},
|
||||
{file = "lxml-4.9.4-cp27-cp27mu-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:647bfe88b1997d7ae8d45dabc7c868d8cb0c8412a6e730a7651050b8c7289cf2"},
|
||||
{file = "lxml-4.9.4-cp27-cp27mu-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:4d973729ce04784906a19108054e1fd476bc85279a403ea1a72fdb051c76fa48"},
|
||||
{file = "lxml-4.9.4-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:056a17eaaf3da87a05523472ae84246f87ac2f29a53306466c22e60282e54ff8"},
|
||||
{file = "lxml-4.9.4-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:aaa5c173a26960fe67daa69aa93d6d6a1cd714a6eb13802d4e4bd1d24a530644"},
|
||||
{file = "lxml-4.9.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:647459b23594f370c1c01768edaa0ba0959afc39caeeb793b43158bb9bb6a663"},
|
||||
{file = "lxml-4.9.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:bdd9abccd0927673cffe601d2c6cdad1c9321bf3437a2f507d6b037ef91ea307"},
|
||||
{file = "lxml-4.9.4-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:00e91573183ad273e242db5585b52670eddf92bacad095ce25c1e682da14ed91"},
|
||||
{file = "lxml-4.9.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:a602ed9bd2c7d85bd58592c28e101bd9ff9c718fbde06545a70945ffd5d11868"},
|
||||
{file = "lxml-4.9.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:de362ac8bc962408ad8fae28f3967ce1a262b5d63ab8cefb42662566737f1dc7"},
|
||||
{file = "lxml-4.9.4-cp310-cp310-win32.whl", hash = "sha256:33714fcf5af4ff7e70a49731a7cc8fd9ce910b9ac194f66eaa18c3cc0a4c02be"},
|
||||
{file = "lxml-4.9.4-cp310-cp310-win_amd64.whl", hash = "sha256:d3caa09e613ece43ac292fbed513a4bce170681a447d25ffcbc1b647d45a39c5"},
|
||||
{file = "lxml-4.9.4-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:359a8b09d712df27849e0bcb62c6a3404e780b274b0b7e4c39a88826d1926c28"},
|
||||
{file = "lxml-4.9.4-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:43498ea734ccdfb92e1886dfedaebeb81178a241d39a79d5351ba2b671bff2b2"},
|
||||
{file = "lxml-4.9.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:4855161013dfb2b762e02b3f4d4a21cc7c6aec13c69e3bffbf5022b3e708dd97"},
|
||||
{file = "lxml-4.9.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:c71b5b860c5215fdbaa56f715bc218e45a98477f816b46cfde4a84d25b13274e"},
|
||||
{file = "lxml-4.9.4-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:9a2b5915c333e4364367140443b59f09feae42184459b913f0f41b9fed55794a"},
|
||||
{file = "lxml-4.9.4-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:d82411dbf4d3127b6cde7da0f9373e37ad3a43e89ef374965465928f01c2b979"},
|
||||
{file = "lxml-4.9.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:273473d34462ae6e97c0f4e517bd1bf9588aa67a1d47d93f760a1282640e24ac"},
|
||||
{file = "lxml-4.9.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:389d2b2e543b27962990ab529ac6720c3dded588cc6d0f6557eec153305a3622"},
|
||||
{file = "lxml-4.9.4-cp311-cp311-win32.whl", hash = "sha256:8aecb5a7f6f7f8fe9cac0bcadd39efaca8bbf8d1bf242e9f175cbe4c925116c3"},
|
||||
{file = "lxml-4.9.4-cp311-cp311-win_amd64.whl", hash = "sha256:c7721a3ef41591341388bb2265395ce522aba52f969d33dacd822da8f018aff8"},
|
||||
{file = "lxml-4.9.4-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:dbcb2dc07308453db428a95a4d03259bd8caea97d7f0776842299f2d00c72fc8"},
|
||||
{file = "lxml-4.9.4-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:01bf1df1db327e748dcb152d17389cf6d0a8c5d533ef9bab781e9d5037619229"},
|
||||
{file = "lxml-4.9.4-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:e8f9f93a23634cfafbad6e46ad7d09e0f4a25a2400e4a64b1b7b7c0fbaa06d9d"},
|
||||
{file = "lxml-4.9.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:3f3f00a9061605725df1816f5713d10cd94636347ed651abdbc75828df302b20"},
|
||||
{file = "lxml-4.9.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:953dd5481bd6252bd480d6ec431f61d7d87fdcbbb71b0d2bdcfc6ae00bb6fb10"},
|
||||
{file = "lxml-4.9.4-cp312-cp312-win32.whl", hash = "sha256:266f655d1baff9c47b52f529b5f6bec33f66042f65f7c56adde3fcf2ed62ae8b"},
|
||||
{file = "lxml-4.9.4-cp312-cp312-win_amd64.whl", hash = "sha256:f1faee2a831fe249e1bae9cbc68d3cd8a30f7e37851deee4d7962b17c410dd56"},
|
||||
{file = "lxml-4.9.4-cp35-cp35m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:23d891e5bdc12e2e506e7d225d6aa929e0a0368c9916c1fddefab88166e98b20"},
|
||||
{file = "lxml-4.9.4-cp35-cp35m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:e96a1788f24d03e8d61679f9881a883ecdf9c445a38f9ae3f3f193ab6c591c66"},
|
||||
{file = "lxml-4.9.4-cp36-cp36m-macosx_11_0_x86_64.whl", hash = "sha256:5557461f83bb7cc718bc9ee1f7156d50e31747e5b38d79cf40f79ab1447afd2d"},
|
||||
{file = "lxml-4.9.4-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:fdb325b7fba1e2c40b9b1db407f85642e32404131c08480dd652110fc908561b"},
|
||||
{file = "lxml-4.9.4-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3d74d4a3c4b8f7a1f676cedf8e84bcc57705a6d7925e6daef7a1e54ae543a197"},
|
||||
{file = "lxml-4.9.4-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:ac7674d1638df129d9cb4503d20ffc3922bd463c865ef3cb412f2c926108e9a4"},
|
||||
{file = "lxml-4.9.4-cp36-cp36m-manylinux_2_28_x86_64.whl", hash = "sha256:ddd92e18b783aeb86ad2132d84a4b795fc5ec612e3545c1b687e7747e66e2b53"},
|
||||
{file = "lxml-4.9.4-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2bd9ac6e44f2db368ef8986f3989a4cad3de4cd55dbdda536e253000c801bcc7"},
|
||||
{file = "lxml-4.9.4-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:bc354b1393dce46026ab13075f77b30e40b61b1a53e852e99d3cc5dd1af4bc85"},
|
||||
{file = "lxml-4.9.4-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:f836f39678cb47c9541f04d8ed4545719dc31ad850bf1832d6b4171e30d65d23"},
|
||||
{file = "lxml-4.9.4-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:9c131447768ed7bc05a02553d939e7f0e807e533441901dd504e217b76307745"},
|
||||
{file = "lxml-4.9.4-cp36-cp36m-win32.whl", hash = "sha256:bafa65e3acae612a7799ada439bd202403414ebe23f52e5b17f6ffc2eb98c2be"},
|
||||
{file = "lxml-4.9.4-cp36-cp36m-win_amd64.whl", hash = "sha256:6197c3f3c0b960ad033b9b7d611db11285bb461fc6b802c1dd50d04ad715c225"},
|
||||
{file = "lxml-4.9.4-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:7b378847a09d6bd46047f5f3599cdc64fcb4cc5a5a2dd0a2af610361fbe77b16"},
|
||||
{file = "lxml-4.9.4-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:1343df4e2e6e51182aad12162b23b0a4b3fd77f17527a78c53f0f23573663545"},
|
||||
{file = "lxml-4.9.4-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:6dbdacf5752fbd78ccdb434698230c4f0f95df7dd956d5f205b5ed6911a1367c"},
|
||||
{file = "lxml-4.9.4-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:506becdf2ecaebaf7f7995f776394fcc8bd8a78022772de66677c84fb02dd33d"},
|
||||
{file = "lxml-4.9.4-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ca8e44b5ba3edb682ea4e6185b49661fc22b230cf811b9c13963c9f982d1d964"},
|
||||
{file = "lxml-4.9.4-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:9d9d5726474cbbef279fd709008f91a49c4f758bec9c062dfbba88eab00e3ff9"},
|
||||
{file = "lxml-4.9.4-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:bbdd69e20fe2943b51e2841fc1e6a3c1de460d630f65bde12452d8c97209464d"},
|
||||
{file = "lxml-4.9.4-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8671622256a0859f5089cbe0ce4693c2af407bc053dcc99aadff7f5310b4aa02"},
|
||||
{file = "lxml-4.9.4-cp37-cp37m-win32.whl", hash = "sha256:dd4fda67f5faaef4f9ee5383435048ee3e11ad996901225ad7615bc92245bc8e"},
|
||||
{file = "lxml-4.9.4-cp37-cp37m-win_amd64.whl", hash = "sha256:6bee9c2e501d835f91460b2c904bc359f8433e96799f5c2ff20feebd9bb1e590"},
|
||||
{file = "lxml-4.9.4-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:1f10f250430a4caf84115b1e0f23f3615566ca2369d1962f82bef40dd99cd81a"},
|
||||
{file = "lxml-4.9.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:3b505f2bbff50d261176e67be24e8909e54b5d9d08b12d4946344066d66b3e43"},
|
||||
{file = "lxml-4.9.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:1449f9451cd53e0fd0a7ec2ff5ede4686add13ac7a7bfa6988ff6d75cff3ebe2"},
|
||||
{file = "lxml-4.9.4-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:4ece9cca4cd1c8ba889bfa67eae7f21d0d1a2e715b4d5045395113361e8c533d"},
|
||||
{file = "lxml-4.9.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:59bb5979f9941c61e907ee571732219fa4774d5a18f3fa5ff2df963f5dfaa6bc"},
|
||||
{file = "lxml-4.9.4-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:b1980dbcaad634fe78e710c8587383e6e3f61dbe146bcbfd13a9c8ab2d7b1192"},
|
||||
{file = "lxml-4.9.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:9ae6c3363261021144121427b1552b29e7b59de9d6a75bf51e03bc072efb3c37"},
|
||||
{file = "lxml-4.9.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:bcee502c649fa6351b44bb014b98c09cb00982a475a1912a9881ca28ab4f9cd9"},
|
||||
{file = "lxml-4.9.4-cp38-cp38-win32.whl", hash = "sha256:a8edae5253efa75c2fc79a90068fe540b197d1c7ab5803b800fccfe240eed33c"},
|
||||
{file = "lxml-4.9.4-cp38-cp38-win_amd64.whl", hash = "sha256:701847a7aaefef121c5c0d855b2affa5f9bd45196ef00266724a80e439220e46"},
|
||||
{file = "lxml-4.9.4-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:f610d980e3fccf4394ab3806de6065682982f3d27c12d4ce3ee46a8183d64a6a"},
|
||||
{file = "lxml-4.9.4-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:aa9b5abd07f71b081a33115d9758ef6077924082055005808f68feccb27616bd"},
|
||||
{file = "lxml-4.9.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:365005e8b0718ea6d64b374423e870648ab47c3a905356ab6e5a5ff03962b9a9"},
|
||||
{file = "lxml-4.9.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:16b9ec51cc2feab009e800f2c6327338d6ee4e752c76e95a35c4465e80390ccd"},
|
||||
{file = "lxml-4.9.4-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:a905affe76f1802edcac554e3ccf68188bea16546071d7583fb1b693f9cf756b"},
|
||||
{file = "lxml-4.9.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fd814847901df6e8de13ce69b84c31fc9b3fb591224d6762d0b256d510cbf382"},
|
||||
{file = "lxml-4.9.4-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:91bbf398ac8bb7d65a5a52127407c05f75a18d7015a270fdd94bbcb04e65d573"},
|
||||
{file = "lxml-4.9.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:f99768232f036b4776ce419d3244a04fe83784bce871b16d2c2e984c7fcea847"},
|
||||
{file = "lxml-4.9.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:bb5bd6212eb0edfd1e8f254585290ea1dadc3687dd8fd5e2fd9a87c31915cdab"},
|
||||
{file = "lxml-4.9.4-cp39-cp39-win32.whl", hash = "sha256:88f7c383071981c74ec1998ba9b437659e4fd02a3c4a4d3efc16774eb108d0ec"},
|
||||
{file = "lxml-4.9.4-cp39-cp39-win_amd64.whl", hash = "sha256:936e8880cc00f839aa4173f94466a8406a96ddce814651075f95837316369899"},
|
||||
{file = "lxml-4.9.4-pp310-pypy310_pp73-macosx_11_0_x86_64.whl", hash = "sha256:f6c35b2f87c004270fa2e703b872fcc984d714d430b305145c39d53074e1ffe0"},
|
||||
{file = "lxml-4.9.4-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:606d445feeb0856c2b424405236a01c71af7c97e5fe42fbc778634faef2b47e4"},
|
||||
{file = "lxml-4.9.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a1bdcbebd4e13446a14de4dd1825f1e778e099f17f79718b4aeaf2403624b0f7"},
|
||||
{file = "lxml-4.9.4-pp37-pypy37_pp73-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:0a08c89b23117049ba171bf51d2f9c5f3abf507d65d016d6e0fa2f37e18c0fc5"},
|
||||
{file = "lxml-4.9.4-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:232fd30903d3123be4c435fb5159938c6225ee8607b635a4d3fca847003134ba"},
|
||||
{file = "lxml-4.9.4-pp37-pypy37_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:231142459d32779b209aa4b4d460b175cadd604fed856f25c1571a9d78114771"},
|
||||
{file = "lxml-4.9.4-pp38-pypy38_pp73-macosx_11_0_x86_64.whl", hash = "sha256:520486f27f1d4ce9654154b4494cf9307b495527f3a2908ad4cb48e4f7ed7ef7"},
|
||||
{file = "lxml-4.9.4-pp38-pypy38_pp73-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:562778586949be7e0d7435fcb24aca4810913771f845d99145a6cee64d5b67ca"},
|
||||
{file = "lxml-4.9.4-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:a9e7c6d89c77bb2770c9491d988f26a4b161d05c8ca58f63fb1f1b6b9a74be45"},
|
||||
{file = "lxml-4.9.4-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:786d6b57026e7e04d184313c1359ac3d68002c33e4b1042ca58c362f1d09ff58"},
|
||||
{file = "lxml-4.9.4-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:95ae6c5a196e2f239150aa4a479967351df7f44800c93e5a975ec726fef005e2"},
|
||||
{file = "lxml-4.9.4-pp39-pypy39_pp73-macosx_11_0_x86_64.whl", hash = "sha256:9b556596c49fa1232b0fff4b0e69b9d4083a502e60e404b44341e2f8fb7187f5"},
|
||||
{file = "lxml-4.9.4-pp39-pypy39_pp73-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:cc02c06e9e320869d7d1bd323df6dd4281e78ac2e7f8526835d3d48c69060683"},
|
||||
{file = "lxml-4.9.4-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:857d6565f9aa3464764c2cb6a2e3c2e75e1970e877c188f4aeae45954a314e0c"},
|
||||
{file = "lxml-4.9.4-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:c42ae7e010d7d6bc51875d768110c10e8a59494855c3d4c348b068f5fb81fdcd"},
|
||||
{file = "lxml-4.9.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:f10250bb190fb0742e3e1958dd5c100524c2cc5096c67c8da51233f7448dc137"},
|
||||
{file = "lxml-4.9.4.tar.gz", hash = "sha256:b1541e50b78e15fa06a2670157a1962ef06591d4c998b998047fff5e3236880e"},
|
||||
]
|
||||
|
||||
[package.extras]
|
||||
|
|
@@ -5488,13 +5711,13 @@ tests = ["pytest (>=5.4.1)", "pytest-cov (>=2.8.1)", "pytest-mypy (>=0.8.0)", "p
 
 [[package]]
 name = "postgrest"
-version = "0.15.0"
+version = "0.16.0"
 description = "PostgREST client for Python. This library provides an ORM interface to PostgREST."
 optional = false
 python-versions = ">=3.8,<4.0"
 files = [
-    {file = "postgrest-0.15.0-py3-none-any.whl", hash = "sha256:f405b3c4adfa3fe61732fabb1d5d7c55111159d25fc595663ea75ff992cafd5b"},
-    {file = "postgrest-0.15.0.tar.gz", hash = "sha256:2e6b4b2b721be2c4e2dbc8de49f8b6a8ed74663b3b0f6b04976c04e222b283cb"},
+    {file = "postgrest-0.16.0-py3-none-any.whl", hash = "sha256:6ea070b16ea336ad968c6ce07ddd82c2c2775607c7daf834e80ec0fbb9f42357"},
+    {file = "postgrest-0.16.0.tar.gz", hash = "sha256:9256b07f312f59a7c9c291cd6c6f8dea2d29d83a8839d16881289c3848c772e3"},
 ]
 
 [package.dependencies]
@@ -6042,13 +6265,13 @@ files = [
 
 [[package]]
 name = "pydantic"
-version = "2.6.2"
+version = "2.6.3"
 description = "Data validation using Python type hints"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "pydantic-2.6.2-py3-none-any.whl", hash = "sha256:37a5432e54b12fecaa1049c5195f3d860a10e01bdfd24f1840ef14bd0d3aeab3"},
-    {file = "pydantic-2.6.2.tar.gz", hash = "sha256:a09be1c3d28f3abe37f8a78af58284b236a92ce520105ddc91a6d29ea1176ba7"},
+    {file = "pydantic-2.6.3-py3-none-any.whl", hash = "sha256:72c6034df47f46ccdf81869fddb81aade68056003900a8724a4f160700016a2a"},
+    {file = "pydantic-2.6.3.tar.gz", hash = "sha256:e07805c4c7f5c6826e33a1d4c9d47950d7eaf34868e2690f8594d2e30241f11f"},
 ]
 
 [package.dependencies]
@@ -7076,17 +7299,17 @@ files = [
 
 [[package]]
 name = "redis"
-version = "5.0.1"
+version = "5.0.2"
 description = "Python client for Redis database and key-value store"
 optional = true
 python-versions = ">=3.7"
 files = [
-    {file = "redis-5.0.1-py3-none-any.whl", hash = "sha256:ed4802971884ae19d640775ba3b03aa2e7bd5e8fb8dfaed2decce4d0fc48391f"},
-    {file = "redis-5.0.1.tar.gz", hash = "sha256:0dab495cd5753069d3bc650a0dde8a8f9edde16fc5691b689a566eda58100d0f"},
+    {file = "redis-5.0.2-py3-none-any.whl", hash = "sha256:4caa8e1fcb6f3c0ef28dba99535101d80934b7d4cd541bbb47f4a3826ee472d1"},
+    {file = "redis-5.0.2.tar.gz", hash = "sha256:3f82cc80d350e93042c8e6e7a5d0596e4dd68715babffba79492733e1f367037"},
 ]
 
 [package.dependencies]
-async-timeout = {version = ">=4.0.2", markers = "python_full_version <= \"3.11.2\""}
+async-timeout = ">=4.0.3"
 
 [package.extras]
 hiredis = ["hiredis (>=1.0.0)"]
@@ -7235,13 +7458,13 @@ rsa = ["oauthlib[signedtoken] (>=3.0.0)"]
 
 [[package]]
 name = "rich"
-version = "13.7.0"
+version = "13.7.1"
 description = "Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal"
 optional = false
 python-versions = ">=3.7.0"
 files = [
-    {file = "rich-13.7.0-py3-none-any.whl", hash = "sha256:6da14c108c4866ee9520bbffa71f6fe3962e193b7da68720583850cd4548e235"},
-    {file = "rich-13.7.0.tar.gz", hash = "sha256:5cb5123b5cf9ee70584244246816e9114227e0b98ad9176eede6ad54bf5403fa"},
+    {file = "rich-13.7.1-py3-none-any.whl", hash = "sha256:4edbae314f59eb482f54e9e30bf00d33350aaa94f4bfcd4e9e3110e64d0d7222"},
+    {file = "rich-13.7.1.tar.gz", hash = "sha256:9be308cb1fe2f1f57d67ce99e95af38a1e2bc71ad9813b0e247cf7ffbcc3a432"},
 ]
 
 [package.dependencies]
@@ -7813,19 +8036,19 @@ test = ["pylint", "pytest", "pytest-black", "pytest-cov", "pytest-pylint"]
 
 [[package]]
 name = "supabase"
-version = "2.3.7"
+version = "2.4.0"
 description = "Supabase client for Python."
 optional = false
 python-versions = ">=3.8,<4.0"
 files = [
-    {file = "supabase-2.3.7-py3-none-any.whl", hash = "sha256:a4616aa9149231d20f6e61884b90b7e5bdbde0ef0c2f0c12ced14536f39055bc"},
-    {file = "supabase-2.3.7.tar.gz", hash = "sha256:d70dc986b7cd2a97c1916da1fa0ea6abae25690521cc9dd78016ab0e0c07116e"},
+    {file = "supabase-2.4.0-py3-none-any.whl", hash = "sha256:f2f02b0e7903247ef9e2b3cb5dde067924a19a068f1c8befbdf40fb091bf8dd3"},
+    {file = "supabase-2.4.0.tar.gz", hash = "sha256:d51556d3884f2e6f4588c33f1fcac954d4304238253bc35e9a87fdd22c43bafb"},
 ]
 
 [package.dependencies]
 gotrue = ">=1.3,<3.0"
 httpx = ">=0.24,<0.26"
-postgrest = ">=0.10.8,<0.16.0"
+postgrest = ">=0.10.8,<0.17.0"
 realtime = ">=1.0.0,<2.0.0"
 storage3 = ">=0.5.3,<0.8.0"
 supafunc = ">=0.3.1,<0.4.0"
@@ -8383,13 +8606,13 @@ files = [
 
 [[package]]
 name = "types-pyopenssl"
-version = "24.0.0.20240130"
+version = "24.0.0.20240228"
 description = "Typing stubs for pyOpenSSL"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "types-pyOpenSSL-24.0.0.20240130.tar.gz", hash = "sha256:c812e5c1c35249f75ef5935708b2a997d62abf9745be222e5f94b9595472ab25"},
-    {file = "types_pyOpenSSL-24.0.0.20240130-py3-none-any.whl", hash = "sha256:24a255458b5b8a7fca8139cf56f2a8ad5a4f1a5f711b73a5bb9cb50dc688fab5"},
+    {file = "types-pyOpenSSL-24.0.0.20240228.tar.gz", hash = "sha256:cd990717d8aa3743ef0e73e0f462e64b54d90c304249232d48fece4f0f7c3c6a"},
+    {file = "types_pyOpenSSL-24.0.0.20240228-py3-none-any.whl", hash = "sha256:a472cf877a873549175e81972f153f44e975302a3cf17381eb5f3d41ccfb75a4"},
 ]
 
 [package.dependencies]
@@ -8552,13 +8775,13 @@ devenv = ["check-manifest", "pytest (>=4.3)", "pytest-cov", "pytest-mock (>=3.3)
 
 [[package]]
 name = "unstructured"
-version = "0.12.4"
+version = "0.12.5"
 description = "A library that prepares raw documents for downstream ML tasks."
 optional = false
 python-versions = ">=3.9.0,<3.12"
 files = [
-    {file = "unstructured-0.12.4-py3-none-any.whl", hash = "sha256:f1aa046297a3afba3aa16895e513aca6a93802ef73b7a18080656435c4deb217"},
-    {file = "unstructured-0.12.4.tar.gz", hash = "sha256:019cf52e9e2bfa286e61ffa0d7d336e1645280f9a0f165e697583143fcfe708a"},
+    {file = "unstructured-0.12.5-py3-none-any.whl", hash = "sha256:cce7de36964f556810adb8728d0639d8e9b3ef4567443877609f3c66a54e24d1"},
+    {file = "unstructured-0.12.5.tar.gz", hash = "sha256:5ea6c881049e7d98a88c07192bcb6ab750de41b4e3b594972ed1034bda99dbae"},
 ]
 
 [package.dependencies]
@@ -8585,6 +8808,7 @@ wrapt = "*"
 [package.extras]
 airtable = ["pyairtable"]
 all-docs = ["markdown", "msg-parser", "networkx", "onnx", "openpyxl", "pandas", "pdf2image", "pdfminer.six", "pikepdf", "pillow-heif", "pypandoc", "pypdf", "python-docx", "python-pptx (<=0.6.23)", "unstructured-inference (==0.7.23)", "unstructured.pytesseract (>=0.3.12)", "xlrd"]
+astra = ["astrapy"]
 azure = ["adlfs", "fsspec"]
 azure-cognitive-search = ["azure-search-documents"]
 bedrock = ["boto3", "langchain-community"]
@@ -9385,4 +9609,4 @@ local = ["ctransformers", "llama-cpp-python", "sentence-transformers"]
 [metadata]
 lock-version = "2.0"
 python-versions = ">=3.9,<3.12"
-content-hash = "e34d70b4ca2e9bdab5478d4b0b31dc39379c4506d1cc6962e378090570ce757c"
+content-hash = "25c6686705b9e1a5a01b48c85bfc8e04592f1cf91d69ff47d421d9d7a895e1df"
@@ -61,7 +61,7 @@ python-multipart = "^0.0.7"
 sqlmodel = "^0.0.14"
 faiss-cpu = "^1.7.4"
 anthropic = "^0.15.0"
-orjson = "^3.9.3"
+orjson = "3.9.15"
 multiprocess = "^0.70.14"
 cachetools = "^5.3.1"
 types-cachetools = "^5.3.0.5"
@@ -109,7 +109,11 @@ def version_callback(value: bool):
 @app.callback()
 def main_entry_point(
     version: bool = typer.Option(
-        None, "--version", callback=version_callback, is_eager=True, help="Show the version and exit."
+        None,
+        "--version",
+        callback=version_callback,
+        is_eager=True,
+        help="Show the version and exit.",
     ),
 ):
     """
@@ -63,7 +63,7 @@ version_path_separator = os  # Use os.pathsep. Default configuration used for ne
 # This is the path to the db in the root of the project.
 # When the user runs the Langflow the database url will
 # be set dinamically.
-sqlalchemy.url = sqlite:///../../../langflow.db
+sqlalchemy.url = sqlite:///./langflow.db
 
 
 [post_write_hooks]
@@ -98,7 +98,7 @@ handlers =
 qualname = sqlalchemy.engine
 
 [logger_alembic]
-level = INFO
+level = DEBUG
 handlers =
 qualname = alembic
 
@@ -1,10 +1,11 @@
+import os
 from logging.config import fileConfig
 
-from sqlalchemy import engine_from_config
-from sqlalchemy import pool
-
 from alembic import context
+from loguru import logger
+from sqlalchemy import engine_from_config, pool
 
 from langflow.services.database.models import *  # noqa
 from langflow.services.database.service import SQLModel
 
 # this is the Alembic Config object, which provides
@@ -40,7 +41,8 @@ def run_migrations_offline() -> None:
     script output.
 
     """
-    url = config.get_main_option("sqlalchemy.url")
+    url = os.getenv("LANGFLOW_DATABASE_URL")
+    url = url or config.get_main_option("sqlalchemy.url")
     context.configure(
         url=url,
         target_metadata=target_metadata,
@@ -60,12 +62,17 @@ def run_migrations_online() -> None:
     and associate a connection with the context.

    """
-    connectable = engine_from_config(
-        config.get_section(config.config_ini_section, {}),
-        prefix="sqlalchemy.",
-        poolclass=pool.NullPool,
-    )
+    from langflow.services.deps import get_db_service
+
+    try:
+        connectable = get_db_service().engine
+    except Exception as e:
+        logger.error(f"Error getting database engine: {e}")
+        connectable = engine_from_config(
+            config.get_section(config.config_ini_section, {}),
+            prefix="sqlalchemy.",
+            poolclass=pool.NullPool,
+        )
     with connectable.connect() as connection:
         context.configure(
             connection=connection, target_metadata=target_metadata, render_as_batch=True
@@ -10,6 +10,7 @@ from typing import Sequence, Union
 from alembic import op
 import sqlalchemy as sa
 import sqlmodel
+from sqlalchemy.engine.reflection import Inspector
 ${imports if imports else ""}
 
 # revision identifiers, used by Alembic.
@@ -20,8 +21,12 @@ depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
 
 
 def upgrade() -> None:
+    conn = op.get_bind()
+    inspector = Inspector.from_engine(conn)  # type: ignore
    ${upgrades if upgrades else "pass"}
 
 
 def downgrade() -> None:
+    conn = op.get_bind()
+    inspector = Inspector.from_engine(conn)  # type: ignore
    ${downgrades if downgrades else "pass"}
@@ -5,28 +5,43 @@ Revises: 1ef9c4f3765d
 Create Date: 2023-12-13 18:55:52.587360
 
 """
+
 from typing import Sequence, Union
 
 from alembic import op
+from sqlalchemy.engine.reflection import Inspector
 
 # revision identifiers, used by Alembic.
-revision: str = '006b3990db50'
-down_revision: Union[str, None] = '1ef9c4f3765d'
+revision: str = "006b3990db50"
+down_revision: Union[str, None] = "1ef9c4f3765d"
 branch_labels: Union[str, Sequence[str], None] = None
 depends_on: Union[str, Sequence[str], None] = None
 
 
 def upgrade() -> None:
     # ### commands auto generated by Alembic - please adjust! ###
+    conn = op.get_bind()
+    inspector = Inspector.from_engine(conn)  # type: ignore
+    api_key_constraints = inspector.get_unique_constraints("apikey")
+    flow_constraints = inspector.get_unique_constraints("flow")
+    user_constraints = inspector.get_unique_constraints("user")
     try:
-        with op.batch_alter_table('apikey', schema=None) as batch_op:
-            batch_op.create_unique_constraint('uq_apikey_id', ['id'])
-
-        with op.batch_alter_table('flow', schema=None) as batch_op:
-            batch_op.create_unique_constraint('uq_flow_id', ['id'])
-
-        with op.batch_alter_table('user', schema=None) as batch_op:
-            batch_op.create_unique_constraint('uq_user_id', ['id'])
+        if not any(
+            constraint["name"] == "uq_apikey_id" for constraint in api_key_constraints
+        ):
+            with op.batch_alter_table("apikey", schema=None) as batch_op:
+                batch_op.create_unique_constraint("uq_apikey_id", ["id"])
+        if not any(
+            constraint["name"] == "uq_flow_id" for constraint in flow_constraints
+        ):
+            with op.batch_alter_table("flow", schema=None) as batch_op:
+                batch_op.create_unique_constraint("uq_flow_id", ["id"])
+        if not any(
+            constraint["name"] == "uq_user_id" for constraint in user_constraints
+        ):
+            with op.batch_alter_table("user", schema=None) as batch_op:
+                batch_op.create_unique_constraint("uq_user_id", ["id"])
     except Exception as e:
         print(e)
         pass
@ -36,15 +51,24 @@ def upgrade() -> None:
|
|||
|
||||
def downgrade() -> None:
|
||||
# ### commands auto generated by Alembic - please adjust! ###
|
||||
conn = op.get_bind()
|
||||
inspector = Inspector.from_engine(conn) # type: ignore
|
||||
api_key_constraints = inspector.get_unique_constraints("apikey")
|
||||
flow_constraints = inspector.get_unique_constraints("flow")
|
||||
user_constraints = inspector.get_unique_constraints("user")
|
||||
try:
|
||||
with op.batch_alter_table('user', schema=None) as batch_op:
|
||||
batch_op.drop_constraint('uq_user_id', type_='unique')
|
||||
if any(
|
||||
constraint["name"] == "uq_apikey_id" for constraint in api_key_constraints
|
||||
):
|
||||
with op.batch_alter_table("user", schema=None) as batch_op:
|
||||
batch_op.drop_constraint("uq_user_id", type_="unique")
|
||||
if any(constraint["name"] == "uq_flow_id" for constraint in flow_constraints):
|
||||
with op.batch_alter_table("flow", schema=None) as batch_op:
|
||||
batch_op.drop_constraint("uq_flow_id", type_="unique")
|
||||
if any(constraint["name"] == "uq_user_id" for constraint in user_constraints):
|
||||
|
||||
with op.batch_alter_table('flow', schema=None) as batch_op:
|
||||
batch_op.drop_constraint('uq_flow_id', type_='unique')
|
||||
|
||||
with op.batch_alter_table('apikey', schema=None) as batch_op:
|
||||
batch_op.drop_constraint('uq_apikey_id', type_='unique')
|
||||
with op.batch_alter_table("apikey", schema=None) as batch_op:
|
||||
batch_op.drop_constraint("uq_apikey_id", type_="unique")
|
||||
except Exception as e:
|
||||
print(e)
|
||||
pass
|
||||
|
|
|
|||
|
|
@@ -5,67 +5,25 @@ Revises: 006b3990db50
Create Date: 2024-01-17 10:32:56.686287

"""

from typing import Sequence, Union

from alembic import op

# revision identifiers, used by Alembic.
revision: str = '0b8757876a7c'
down_revision: Union[str, None] = '006b3990db50'
revision: str = "0b8757876a7c"
down_revision: Union[str, None] = "006b3990db50"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    try:
        with op.batch_alter_table('apikey', schema=None) as batch_op:
            batch_op.create_index(batch_op.f('ix_apikey_api_key'), ['api_key'], unique=True)
            batch_op.create_index(batch_op.f('ix_apikey_name'), ['name'], unique=False)
            batch_op.create_index(batch_op.f('ix_apikey_user_id'), ['user_id'], unique=False)
    except Exception as e:
        print(e)
        pass
    try:
        with op.batch_alter_table('flow', schema=None) as batch_op:
            batch_op.create_index(batch_op.f('ix_flow_description'), ['description'], unique=False)
            batch_op.create_index(batch_op.f('ix_flow_name'), ['name'], unique=False)
            batch_op.create_index(batch_op.f('ix_flow_user_id'), ['user_id'], unique=False)
    except Exception as e:
        print(e)
        pass
    pass

    try:
        with op.batch_alter_table('user', schema=None) as batch_op:
            batch_op.create_index(batch_op.f('ix_user_username'), ['username'], unique=True)
    except Exception as e:
        print(e)
        pass
    # ### end Alembic commands ###


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    try:
        with op.batch_alter_table('user', schema=None) as batch_op:
            batch_op.drop_index(batch_op.f('ix_user_username'))
    except Exception as e:
        print(e)
        pass
    try:
        with op.batch_alter_table('flow', schema=None) as batch_op:
            batch_op.drop_index(batch_op.f('ix_flow_user_id'))
            batch_op.drop_index(batch_op.f('ix_flow_name'))
            batch_op.drop_index(batch_op.f('ix_flow_description'))
    except Exception as e:
        print(e)
        pass
    try:
        with op.batch_alter_table('apikey', schema=None) as batch_op:
            batch_op.drop_index(batch_op.f('ix_apikey_user_id'))
            batch_op.drop_index(batch_op.f('ix_apikey_name'))
            batch_op.drop_index(batch_op.f('ix_apikey_api_key'))
    except Exception as e:
        print(e)
        pass
    # ### end Alembic commands ###

    pass
    # ### end Alembic commands ###

@@ -6,6 +6,7 @@ Revises: fd531f8868b1
Create Date: 2023-12-04 15:00:27.968998

"""

from typing import Sequence, Union

import sqlalchemy as sa

@@ -13,8 +14,8 @@ import sqlmodel
from alembic import op

# revision identifiers, used by Alembic.
revision: str = '1ef9c4f3765d'
down_revision: Union[str, None] = 'fd531f8868b1'
revision: str = "1ef9c4f3765d"
down_revision: Union[str, None] = "fd531f8868b1"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


@@ -22,10 +23,10 @@ depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    try:
        with op.batch_alter_table('apikey', schema=None) as batch_op:
            batch_op.alter_column('name',
                existing_type=sqlmodel.sql.sqltypes.AutoString(),
                nullable=True)
        with op.batch_alter_table("apikey", schema=None) as batch_op:
            batch_op.alter_column(
                "name", existing_type=sqlmodel.sql.sqltypes.AutoString(), nullable=True
            )
    except Exception as e:
        pass
    # ### end Alembic commands ###

@@ -34,10 +35,8 @@ def upgrade() -> None:
def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    try:
        with op.batch_alter_table('apikey', schema=None) as batch_op:
            batch_op.alter_column('name',
                existing_type=sa.VARCHAR(),
                nullable=False)
        with op.batch_alter_table("apikey", schema=None) as batch_op:
            batch_op.alter_column("name", existing_type=sa.VARCHAR(), nullable=False)
    except Exception as e:
        pass
    # ### end Alembic commands ###

@@ -5,6 +5,7 @@ Revises:
Create Date: 2023-08-27 19:49:02.681355

"""

from typing import Sequence, Union

import sqlalchemy as sa

@@ -33,7 +34,9 @@ def upgrade() -> None:
    if "ix_flowstyle_flow_id" in [
        index["name"] for index in inspector.get_indexes("flowstyle")
    ]:
        op.drop_index("ix_flowstyle_flow_id", table_name="flowstyle")
        op.drop_index(
            "ix_flowstyle_flow_id", table_name="flowstyle", if_exists=True
        )

    existing_indices_flow = []
    existing_fks_flow = []

@@ -80,8 +83,7 @@ def upgrade() -> None:
        sa.Column("api_key", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("user_id", sqlmodel.sql.sqltypes.GUID(), nullable=False),
        sa.ForeignKeyConstraint(
            ["user_id"],
            ["user.id"],
            ["user_id"], ["user.id"], name="fk_apikey_user_id_user"
        ),
        sa.PrimaryKeyConstraint("id", name="pk_apikey"),
        sa.UniqueConstraint("id", name="uq_apikey_id"),

@@ -103,8 +105,7 @@ def upgrade() -> None:
        sa.Column("id", sqlmodel.sql.sqltypes.GUID(), nullable=False),
        sa.Column("user_id", sqlmodel.sql.sqltypes.GUID(), nullable=False),
        sa.ForeignKeyConstraint(
            ["user_id"],
            ["user.id"],
            ["user_id"], ["user.id"], name="fk_flow_user_id_user"
        ),
        sa.PrimaryKeyConstraint("id", name="pk_flow"),
        sa.UniqueConstraint("id", name="uq_flow_id"),

@@ -151,21 +152,21 @@ def downgrade() -> None:
    existing_tables = inspector.get_table_names()
    if "flow" in existing_tables:
        with op.batch_alter_table("flow", schema=None) as batch_op:
            batch_op.drop_index(batch_op.f("ix_flow_user_id"))
            batch_op.drop_index(batch_op.f("ix_flow_name"))
            batch_op.drop_index(batch_op.f("ix_flow_description"))
            batch_op.drop_index(batch_op.f("ix_flow_user_id"), if_exists=True)
            batch_op.drop_index(batch_op.f("ix_flow_name"), if_exists=True)
            batch_op.drop_index(batch_op.f("ix_flow_description"), if_exists=True)

        op.drop_table("flow")
    if "apikey" in existing_tables:
        with op.batch_alter_table("apikey", schema=None) as batch_op:
            batch_op.drop_index(batch_op.f("ix_apikey_user_id"))
            batch_op.drop_index(batch_op.f("ix_apikey_name"))
            batch_op.drop_index(batch_op.f("ix_apikey_api_key"))
            batch_op.drop_index(batch_op.f("ix_apikey_user_id"), if_exists=True)
            batch_op.drop_index(batch_op.f("ix_apikey_name"), if_exists=True)
            batch_op.drop_index(batch_op.f("ix_apikey_api_key"), if_exists=True)

        op.drop_table("apikey")
    if "user" in existing_tables:
        with op.batch_alter_table("user", schema=None) as batch_op:
            batch_op.drop_index(batch_op.f("ix_user_username"))
            batch_op.drop_index(batch_op.f("ix_user_username"), if_exists=True)

        op.drop_table("user")

@@ -5,34 +5,44 @@ Revises: 7d2162acc8b2
Create Date: 2023-11-24 10:45:38.465302

"""

from typing import Sequence, Union

import sqlalchemy as sa
import sqlmodel
from alembic import op
from sqlalchemy.engine.reflection import Inspector

# revision identifiers, used by Alembic.
revision: str = '2ac71eb9c3ae'
down_revision: Union[str, None] = '7d2162acc8b2'
revision: str = "2ac71eb9c3ae"
down_revision: Union[str, None] = "7d2162acc8b2"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    conn = op.get_bind()
    inspector = Inspector.from_engine(conn)  # type: ignore
    tables = inspector.get_table_names()
    try:
        op.create_table('credential',
            sa.Column('name', sqlmodel.sql.sqltypes.AutoString(), nullable=True),
            sa.Column('value', sqlmodel.sql.sqltypes.AutoString(), nullable=True),
            sa.Column('provider', sqlmodel.sql.sqltypes.AutoString(), nullable=True),
            sa.Column('user_id', sqlmodel.sql.sqltypes.GUID(), nullable=False),
            sa.Column('id', sqlmodel.sql.sqltypes.GUID(), nullable=False),
            sa.Column('created_at', sa.DateTime(), nullable=False),
            sa.Column('updated_at', sa.DateTime(), nullable=True),
            sa.PrimaryKeyConstraint('id'),
        )
        if "credential" not in tables:
            op.create_table(
                "credential",
                sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
                sa.Column("value", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
                sa.Column(
                    "provider", sqlmodel.sql.sqltypes.AutoString(), nullable=True
                ),
                sa.Column("user_id", sqlmodel.sql.sqltypes.GUID(), nullable=False),
                sa.Column("id", sqlmodel.sql.sqltypes.GUID(), nullable=False),
                sa.Column("created_at", sa.DateTime(), nullable=False),
                sa.Column("updated_at", sa.DateTime(), nullable=True),
                sa.PrimaryKeyConstraint("id"),
            )
    except Exception as e:
        print(e)

        pass
    # ### end Alembic commands ###


@@ -40,7 +50,7 @@ def upgrade() -> None:
def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    try:
        op.drop_table('credential')
        op.drop_table("credential")
    except Exception as e:
        print(e)
        pass

@@ -5,6 +5,7 @@ Revises: 260dbcc8b680
Create Date: 2023-09-08 07:36:13.387318

"""

from typing import Sequence, Union

import sqlalchemy as sa

@@ -21,29 +22,36 @@ depends_on: Union[str, Sequence[str], None] = None

def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    conn = op.get_bind()
    inspector = Inspector.from_engine(conn)  # type: ignore
    if "user" in inspector.get_table_names() and "profile_image" not in [
        column["name"] for column in inspector.get_columns("user")
    ]:
        with op.batch_alter_table("user", schema=None) as batch_op:
            batch_op.add_column(
                sa.Column(
                    "profile_image", sqlmodel.sql.sqltypes.AutoString(), nullable=True
    try:
        conn = op.get_bind()
        inspector = Inspector.from_engine(conn)  # type: ignore
        if "user" in inspector.get_table_names() and "profile_image" not in [
            column["name"] for column in inspector.get_columns("user")
        ]:
            with op.batch_alter_table("user", schema=None) as batch_op:
                batch_op.add_column(
                    sa.Column(
                        "profile_image",
                        sqlmodel.sql.sqltypes.AutoString(),
                        nullable=True,
                    )
                )
                )

    except Exception as e:
        print(e)
    # ### end Alembic commands ###


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    conn = op.get_bind()
    inspector = Inspector.from_engine(conn)  # type: ignore
    if "user" in inspector.get_table_names() and "profile_image" in [
        column["name"] for column in inspector.get_columns("user")
    ]:
        with op.batch_alter_table("user", schema=None) as batch_op:
            batch_op.drop_column("profile_image")
    try:
        conn = op.get_bind()
        inspector = Inspector.from_engine(conn)  # type: ignore
        if "user" in inspector.get_table_names() and "profile_image" in [
            column["name"] for column in inspector.get_columns("user")
        ]:
            with op.batch_alter_table("user", schema=None) as batch_op:
                batch_op.drop_column("profile_image")
    except Exception as e:
        print(e)

    # ### end Alembic commands ###

@@ -5,12 +5,13 @@ Revises: eb5866d51fd2
Create Date: 2023-10-18 23:08:57.744906

"""

from typing import Sequence, Union

import sqlalchemy as sa
import sqlmodel
from alembic import op
from loguru import logger
from sqlalchemy.engine.reflection import Inspector

# revision identifiers, used by Alembic.
revision: str = "7843803a87b5"

@@ -21,19 +22,26 @@ depends_on: Union[str, Sequence[str], None] = None

def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    conn = op.get_bind()
    inspector = Inspector.from_engine(conn)  # type: ignore
    flow_columns = [column["name"] for column in inspector.get_columns("flow")]
    user_columns = [column["name"] for column in inspector.get_columns("user")]
    try:
        with op.batch_alter_table("flow", schema=None) as batch_op:
            batch_op.add_column(sa.Column("is_component", sa.Boolean(), nullable=True))

        with op.batch_alter_table("user", schema=None) as batch_op:
            batch_op.add_column(
                sa.Column(
                    "store_api_key", sqlmodel.AutoString(), nullable=True
        if "is_component" not in flow_columns:
            with op.batch_alter_table("flow", schema=None) as batch_op:
                batch_op.add_column(
                    sa.Column("is_component", sa.Boolean(), nullable=True)
                )
            )
    except Exception as e:
        logger.exception(e)

        pass
    try:
        if "store_api_key" not in user_columns:
            with op.batch_alter_table("user", schema=None) as batch_op:
                batch_op.add_column(
                    sa.Column("store_api_key", sqlmodel.AutoString(), nullable=True)
                )
    except Exception as e:
        pass
    # ### end Alembic commands ###

@@ -5,88 +5,74 @@ Revises: f5ee9749d1a6
Create Date: 2023-11-21 20:56:53.998781

"""

from typing import Sequence, Union

import sqlalchemy as sa
import sqlmodel
from alembic import op
from sqlalchemy.engine.reflection import Inspector

# revision identifiers, used by Alembic.
revision: str = '7d2162acc8b2'
down_revision: Union[str, None] = 'f5ee9749d1a6'
revision: str = "7d2162acc8b2"
down_revision: Union[str, None] = "f5ee9749d1a6"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    conn = op.get_bind()
    inspector = Inspector.from_engine(conn)  # type: ignore
    api_key_columns = [column["name"] for column in inspector.get_columns("apikey")]
    flow_columns = [column["name"] for column in inspector.get_columns("flow")]

    try:
        with op.batch_alter_table('component', schema=None) as batch_op:
            batch_op.drop_index('ix_component_frontend_node_id')
            batch_op.drop_index('ix_component_name')
        op.drop_table('component')
        op.drop_table('flowstyle')
        if "name" in api_key_columns:
            with op.batch_alter_table("apikey", schema=None) as batch_op:
                batch_op.alter_column(
                    "name", existing_type=sa.VARCHAR(), nullable=False
                )
    except Exception as e:
        print(e)
        pass
        with op.batch_alter_table('apikey', schema=None) as batch_op:
            batch_op.alter_column('name',
                existing_type=sa.VARCHAR(),
                nullable=False)

        with op.batch_alter_table('flow', schema=None) as batch_op:
            batch_op.add_column(sa.Column('updated_at', sa.DateTime(), nullable=True))
            batch_op.add_column(sa.Column('folder', sqlmodel.sql.sqltypes.AutoString(), nullable=True))
        pass
    try:
        with op.batch_alter_table("flow", schema=None) as batch_op:
            if "updated_at" not in flow_columns:
                batch_op.add_column(
                    sa.Column("updated_at", sa.DateTime(), nullable=True)
                )
            if "folder" not in flow_columns:
                batch_op.add_column(
                    sa.Column(
                        "folder", sqlmodel.sql.sqltypes.AutoString(), nullable=True
                    )
                )
    except Exception as e:
        print(e)

        pass

    # ### end Alembic commands ###


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    try:
        with op.batch_alter_table('flow', schema=None) as batch_op:
            batch_op.drop_column('folder')
            batch_op.drop_column('updated_at')
        with op.batch_alter_table("flow", schema=None) as batch_op:
            batch_op.drop_column("folder")
            batch_op.drop_column("updated_at")
    except Exception as e:
        print(e)
        pass

    try:

        with op.batch_alter_table('apikey', schema=None) as batch_op:
            batch_op.alter_column('name',
                existing_type=sa.VARCHAR(),
                nullable=True)
        with op.batch_alter_table("apikey", schema=None) as batch_op:
            batch_op.alter_column("name", existing_type=sa.VARCHAR(), nullable=True)
    except Exception as e:
        print(e)
        pass
    try:
        op.create_table('flowstyle',
            sa.Column('color', sa.VARCHAR(), nullable=False),
            sa.Column('emoji', sa.VARCHAR(), nullable=False),
            sa.Column('flow_id', sa.CHAR(length=32), nullable=True),
            sa.Column('id', sa.CHAR(length=32), nullable=False),
            sa.ForeignKeyConstraint(['flow_id'], ['flow.id'], ),
            sa.PrimaryKeyConstraint('id'),
            sa.UniqueConstraint('id')
        )
        op.create_table('component',
            sa.Column('id', sa.CHAR(length=32), nullable=False),
            sa.Column('frontend_node_id', sa.CHAR(length=32), nullable=False),
            sa.Column('name', sa.VARCHAR(), nullable=False),
            sa.Column('description', sa.VARCHAR(), nullable=True),
            sa.Column('python_code', sa.VARCHAR(), nullable=True),
            sa.Column('return_type', sa.VARCHAR(), nullable=True),
            sa.Column('is_disabled', sa.BOOLEAN(), nullable=False),
            sa.Column('is_read_only', sa.BOOLEAN(), nullable=False),
            sa.Column('create_at', sa.DATETIME(), nullable=False),
            sa.Column('update_at', sa.DATETIME(), nullable=False),
            sa.PrimaryKeyConstraint('id')
        )

        with op.batch_alter_table('component', schema=None) as batch_op:
            batch_op.create_index('ix_component_name', ['name'], unique=False)
            batch_op.create_index('ix_component_frontend_node_id', ['frontend_node_id'], unique=False)
    except Exception as e:
        print(e)
        pass
    # ### end Alembic commands ###

@@ -5,55 +5,105 @@ Revises: 0b8757876a7c
Create Date: 2024-01-26 13:31:14.797548

"""

from typing import Sequence, Union

import sqlalchemy as sa
import sqlmodel
from alembic import op
from sqlalchemy.engine.reflection import Inspector

# revision identifiers, used by Alembic.
revision: str = 'b2fa308044b5'
down_revision: Union[str, None] = '0b8757876a7c'
revision: str = "b2fa308044b5"
down_revision: Union[str, None] = "0b8757876a7c"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    conn = op.get_bind()
    inspector = Inspector.from_engine(conn)  # type: ignore
    tables = inspector.get_table_names()
    # ### commands auto generated by Alembic - please adjust! ###
    try:
        op.drop_table('flowstyle')
        with op.batch_alter_table('flow', schema=None) as batch_op:
            batch_op.add_column(sa.Column('is_component', sa.Boolean(), nullable=True))
            batch_op.add_column(sa.Column('updated_at', sa.DateTime(), nullable=True))
            batch_op.add_column(sa.Column('folder', sqlmodel.sql.sqltypes.AutoString(), nullable=True))
            batch_op.add_column(sa.Column('user_id', sqlmodel.sql.sqltypes.GUID(), nullable=True))
            batch_op.create_index(batch_op.f('ix_flow_user_id'), ['user_id'], unique=False)
            batch_op.create_foreign_key('fk_flow_user_id_user', 'user', ['user_id'], ['id'])
        if "flowstyle" in tables:
            op.drop_table("flowstyle")
        with op.batch_alter_table("flow", schema=None) as batch_op:
            flow_columns = [column["name"] for column in inspector.get_columns("flow")]
            if "is_component" not in flow_columns:
                batch_op.add_column(
                    sa.Column("is_component", sa.Boolean(), nullable=True)
                )
            if "updated_at" not in flow_columns:
                batch_op.add_column(
                    sa.Column("updated_at", sa.DateTime(), nullable=True)
                )
            if "folder" not in flow_columns:
                batch_op.add_column(
                    sa.Column(
                        "folder", sqlmodel.sql.sqltypes.AutoString(), nullable=True
                    )
                )
            if "user_id" not in flow_columns:
                batch_op.add_column(
                    sa.Column("user_id", sqlmodel.sql.sqltypes.GUID(), nullable=True)
                )
            indices = inspector.get_indexes("flow")
            indices_names = [index["name"] for index in indices]
            if "ix_flow_user_id" not in indices_names:
                batch_op.create_index(
                    batch_op.f("ix_flow_user_id"), ["user_id"], unique=False
                )
            if "fk_flow_user_id_user" not in indices_names:
                batch_op.create_foreign_key(
                    "fk_flow_user_id_user", "user", ["user_id"], ["id"]
                )

    except Exception:
        pass
    # ### end Alembic commands ###


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    conn = op.get_bind()
    inspector = Inspector.from_engine(conn)  # type: ignore
    try:
        with op.batch_alter_table('flow', schema=None) as batch_op:
            batch_op.drop_constraint('fk_flow_user_id_user', type_='foreignkey')
            batch_op.drop_index(batch_op.f('ix_flow_user_id'))
            batch_op.drop_column('user_id')
            batch_op.drop_column('folder')
            batch_op.drop_column('updated_at')
            batch_op.drop_column('is_component')
        # Re-create the dropped table 'flowstyle' if it was previously dropped in upgrade
        if "flowstyle" not in inspector.get_table_names():
            op.create_table(
                "flowstyle",
                sa.Column("color", sa.String(), nullable=False),
                sa.Column("emoji", sa.String(), nullable=False),
                sa.Column("flow_id", sqlmodel.sql.sqltypes.GUID(), nullable=True),
                sa.Column("id", sqlmodel.sql.sqltypes.GUID(), nullable=False),
                sa.ForeignKeyConstraint(["flow_id"], ["flow.id"]),
                sa.PrimaryKeyConstraint("id"),
                sa.UniqueConstraint("id"),
            )

        op.create_table('flowstyle',
            sa.Column('color', sa.VARCHAR(), nullable=False),
            sa.Column('emoji', sa.VARCHAR(), nullable=False),
            sa.Column('flow_id', sa.CHAR(length=32), nullable=True),
            sa.Column('id', sa.CHAR(length=32), nullable=False),
            sa.ForeignKeyConstraint(['flow_id'], ['flow.id'], ),
            sa.PrimaryKeyConstraint('id'),
            sa.UniqueConstraint('id')
        )
    except Exception:
        pass
    # ### end Alembic commands ###
        with op.batch_alter_table("flow", schema=None) as batch_op:
            # Check and remove newly added columns and constraints in upgrade
            flow_columns = [column["name"] for column in inspector.get_columns("flow")]
            if "user_id" in flow_columns:
                batch_op.drop_column("user_id")
            if "folder" in flow_columns:
                batch_op.drop_column("folder")
            if "updated_at" in flow_columns:
                batch_op.drop_column("updated_at")
            if "is_component" in flow_columns:
                batch_op.drop_column("is_component")

            indices = inspector.get_indexes("flow")
            indices_names = [index["name"] for index in indices]
            if "ix_flow_user_id" in indices_names:
                batch_op.drop_index("ix_flow_user_id")
            # Assuming fk_flow_user_id_user is a foreign key constraint's name, not an index
            constraints = inspector.get_foreign_keys("flow")
            constraint_names = [constraint["name"] for constraint in constraints]
            if "fk_flow_user_id_user" in constraint_names:
                batch_op.drop_constraint("fk_flow_user_id_user", type_="foreignkey")

    except Exception as e:
        # It's generally a good idea to log the exception or handle it in a way other than a bare pass
        print(f"Error during downgrade: {e}")

@@ -5,46 +5,68 @@ Revises: b2fa308044b5
Create Date: 2024-01-26 13:34:14.496769

"""

from typing import Sequence, Union

import sqlalchemy as sa
import sqlmodel
from alembic import op
from sqlalchemy.engine.reflection import Inspector

# revision identifiers, used by Alembic.
revision: str = 'bc2f01c40e4a'
down_revision: Union[str, None] = 'b2fa308044b5'
revision: str = "bc2f01c40e4a"
down_revision: Union[str, None] = "b2fa308044b5"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    try:
        with op.batch_alter_table('flow', schema=None) as batch_op:
            batch_op.add_column(sa.Column('is_component', sa.Boolean(), nullable=True))
            batch_op.add_column(sa.Column('updated_at', sa.DateTime(), nullable=True))
            batch_op.add_column(sa.Column('folder', sqlmodel.sql.sqltypes.AutoString(), nullable=True))
            batch_op.add_column(sa.Column('user_id', sqlmodel.sql.sqltypes.GUID(), nullable=True))
            batch_op.create_index(batch_op.f('ix_flow_user_id'), ['user_id'], unique=False)
            batch_op.create_foreign_key('flow_user_id_fkey', 'user', ['user_id'], ['id'])
    except Exception:
        pass
    # ### end Alembic commands ###
    conn = op.get_bind()
    inspector = Inspector.from_engine(conn)  # type: ignore
    flow_columns = {column["name"] for column in inspector.get_columns("flow")}
    flow_indexes = {index["name"] for index in inspector.get_indexes("flow")}
    flow_fks = {fk["name"] for fk in inspector.get_foreign_keys("flow")}

    with op.batch_alter_table("flow", schema=None) as batch_op:
        if "is_component" not in flow_columns:
            batch_op.add_column(sa.Column("is_component", sa.Boolean(), nullable=True))
        if "updated_at" not in flow_columns:
            batch_op.add_column(sa.Column("updated_at", sa.DateTime(), nullable=True))
        if "folder" not in flow_columns:
            batch_op.add_column(
                sa.Column("folder", sqlmodel.sql.sqltypes.AutoString(), nullable=True)
            )
        if "user_id" not in flow_columns:
            batch_op.add_column(
                sa.Column("user_id", sqlmodel.sql.sqltypes.GUID(), nullable=True)
            )
        if "ix_flow_user_id" not in flow_indexes:
            batch_op.create_index(
                batch_op.f("ix_flow_user_id"), ["user_id"], unique=False
            )
        if "flow_user_id_fkey" not in flow_fks:
            batch_op.create_foreign_key(
                "flow_user_id_fkey", "user", ["user_id"], ["id"]
            )


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    try:
        with op.batch_alter_table('flow', schema=None) as batch_op:
            batch_op.drop_constraint('flow_user_id_fkey', type_='foreignkey')
            batch_op.drop_index(batch_op.f('ix_flow_user_id'))
            batch_op.drop_column('user_id')
            batch_op.drop_column('folder')
            batch_op.drop_column('updated_at')
            batch_op.drop_column('is_component')
    except Exception:
        pass
    conn = op.get_bind()
    inspector = Inspector.from_engine(conn)  # type: ignore
    flow_columns = {column["name"] for column in inspector.get_columns("flow")}
    flow_indexes = {index["name"] for index in inspector.get_indexes("flow")}
    flow_fks = {fk["name"] for fk in inspector.get_foreign_keys("flow")}

    # ### end Alembic commands ###
    with op.batch_alter_table("flow", schema=None) as batch_op:
        if "flow_user_id_fkey" in flow_fks:
            batch_op.drop_constraint("flow_user_id_fkey", type_="foreignkey")
        if "ix_flow_user_id" in flow_indexes:
            batch_op.drop_index(batch_op.f("ix_flow_user_id"))
        if "user_id" in flow_columns:
            batch_op.drop_column("user_id")
        if "folder" in flow_columns:
            batch_op.drop_column("folder")
        if "updated_at" in flow_columns:
            batch_op.drop_column("updated_at")
        if "is_component" in flow_columns:
            batch_op.drop_column("is_component")

@@ -5,11 +5,10 @@ Revises: 67cc006d50bf
Create Date: 2023-10-04 10:18:25.640458

"""

from typing import Sequence, Union

import sqlalchemy as sa
from alembic import op
from sqlalchemy import exc

# revision identifiers, used by Alembic.
revision: str = "eb5866d51fd2"
@@ -21,70 +20,12 @@ depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    connection = op.get_bind()
    try:
        op.drop_table("flowstyle")
        with op.batch_alter_table("component", schema=None) as batch_op:
            batch_op.drop_index("ix_component_frontend_node_id")
            batch_op.drop_index("ix_component_name")
    except exc.SQLAlchemyError:
        # connection.execute(text("ROLLBACK"))
        pass
    except Exception as e:
        print(e)
        pass

    try:
        op.drop_table("component")
    except exc.SQLAlchemyError:
        # connection.execute(text("ROLLBACK"))
        pass
    except Exception as e:
        print(e)
        pass
    pass
    # ### end Alembic commands ###


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    try:
        op.create_table(
            "component",
            sa.Column("id", sa.CHAR(length=32), nullable=False),
            sa.Column("frontend_node_id", sa.CHAR(length=32), nullable=False),
            sa.Column("name", sa.VARCHAR(), nullable=False),
            sa.Column("description", sa.VARCHAR(), nullable=True),
            sa.Column("python_code", sa.VARCHAR(), nullable=True),
            sa.Column("return_type", sa.VARCHAR(), nullable=True),
            sa.Column("is_disabled", sa.BOOLEAN(), nullable=False),
            sa.Column("is_read_only", sa.BOOLEAN(), nullable=False),
            sa.Column("create_at", sa.DATETIME(), nullable=False),
            sa.Column("update_at", sa.DATETIME(), nullable=False),
            sa.PrimaryKeyConstraint("id", name="pk_component"),
        )
        with op.batch_alter_table("component", schema=None) as batch_op:
            batch_op.create_index("ix_component_name", ["name"], unique=False)
            batch_op.create_index(
                "ix_component_frontend_node_id", ["frontend_node_id"], unique=False
            )
    except Exception as e:
        print(e)
        pass

    try:
        op.create_table(
            "flowstyle",
            sa.Column("color", sa.VARCHAR(), nullable=False),
            sa.Column("emoji", sa.VARCHAR(), nullable=False),
            sa.Column("flow_id", sa.CHAR(length=32), nullable=True),
            sa.Column("id", sa.CHAR(length=32), nullable=False),
            sa.ForeignKeyConstraint(
                ["flow_id"],
                ["flow.id"],
            ),
            sa.PrimaryKeyConstraint("id", name="pk_flowstyle"),
            sa.UniqueConstraint("id", name="uq_flowstyle_id"),
        )
    except Exception as e:
        print(e)
        pass
    pass
    # ### end Alembic commands ###
@@ -5,6 +5,7 @@ Revises: 7843803a87b5
Create Date: 2023-10-18 23:12:27.297016

"""

from typing import Sequence, Union

import sqlalchemy as sa
@@ -5,22 +5,35 @@ Revises: 2ac71eb9c3ae
Create Date: 2023-11-24 15:07:37.566516

"""
from typing import Sequence, Union

from typing import Optional, Sequence, Union

from alembic import op
from sqlalchemy.engine.reflection import Inspector

# revision identifiers, used by Alembic.
revision: str = 'fd531f8868b1'
down_revision: Union[str, None] = '2ac71eb9c3ae'
revision: str = "fd531f8868b1"
down_revision: Union[str, None] = "2ac71eb9c3ae"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    conn = op.get_bind()
    inspector = Inspector.from_engine(conn)  # type: ignore
    tables = inspector.get_table_names()
    foreign_keys_names = []
    if "credential" in tables:
        foreign_keys = inspector.get_foreign_keys("credential")
        foreign_keys_names = [fk["name"] for fk in foreign_keys]

    try:
        with op.batch_alter_table('credential', schema=None) as batch_op:
            batch_op.create_foreign_key("fk_credential_user_id", 'user', ['user_id'], ['id'])
        if "credential" in tables and "fk_credential_user_id" not in foreign_keys_names:
            with op.batch_alter_table("credential", schema=None) as batch_op:
                batch_op.create_foreign_key(
                    "fk_credential_user_id", "user", ["user_id"], ["id"]
                )
    except Exception as e:
        print(e)
        pass
@@ -30,9 +43,17 @@ def upgrade() -> None:

def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    conn = op.get_bind()
    inspector = Inspector.from_engine(conn)  # type: ignore
    tables = inspector.get_table_names()
    foreign_keys_names: list[Optional[str]] = []
    if "credential" in tables:
        foreign_keys = inspector.get_foreign_keys("credential")
        foreign_keys_names = [fk["name"] for fk in foreign_keys]
    try:
        with op.batch_alter_table('credential', schema=None) as batch_op:
            batch_op.drop_constraint("fk_credential_user_id", type_='foreignkey')
        if "credential" in tables and "fk_credential_user_id" in foreign_keys_names:
            with op.batch_alter_table("credential", schema=None) as batch_op:
                batch_op.drop_constraint("fk_credential_user_id", type_="foreignkey")
    except Exception as e:
        print(e)
        pass
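The migration hunks above replace blind try/except DDL with inspector-guarded operations: the migration first asks the database what already exists, then only creates or drops what is missing. The decision logic can be sketched in plain Python, independent of Alembic (the helper name `plan_fk_ops` is illustrative, not part of the codebase):

```python
def plan_fk_ops(tables, foreign_key_names, fk="fk_credential_user_id", table="credential"):
    """Decide which DDL to run, mirroring the guarded upgrade/downgrade above."""
    ops = []
    # Upgrade path: only create the FK when the table exists and the FK does not.
    if table in tables and fk not in foreign_key_names:
        ops.append(("create_foreign_key", fk))
    # Downgrade path: only drop the FK when it is actually present.
    if table in tables and fk in foreign_key_names:
        ops.append(("drop_constraint", fk))
    return ops


print(plan_fk_ops({"credential", "user"}, set()))
# [('create_foreign_key', 'fk_credential_user_id')]
print(plan_fk_ops({"credential", "user"}, {"fk_credential_user_id"}))
# [('drop_constraint', 'fk_credential_user_id')]
```

Either branch becomes a no-op when rerun, which is what makes the migration safe against databases that were already partially migrated.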
@@ -17,6 +17,11 @@ if TYPE_CHECKING:
class AsyncStreamingLLMCallbackHandleSIO(AsyncCallbackHandler):
    """Callback handler for streaming LLM responses."""

    @property
    def ignore_chain(self) -> bool:
        """Whether to ignore chain callbacks."""
        return False

    def __init__(self, session_id: str):
        self.chat_service = get_chat_service()
        self.client_id = session_id
@@ -28,7 +33,9 @@ class AsyncStreamingLLMCallbackHandleSIO(AsyncCallbackHandler):
        resp = ChatResponse(message=token, type="stream", intermediate_steps="")
        await self.socketio_service.emit_token(to=self.sid, data=resp.model_dump())

    async def on_tool_start(self, serialized: Dict[str, Any], input_str: str, **kwargs: Any) -> Any:
    async def on_tool_start(
        self, serialized: Dict[str, Any], input_str: str, **kwargs: Any
    ) -> Any:
        """Run when tool starts running."""
        resp = ChatResponse(
            message="",
@@ -66,7 +73,9 @@ class AsyncStreamingLLMCallbackHandleSIO(AsyncCallbackHandler):
        try:
            # This is to emulate the stream of tokens
            for resp in resps:
                await self.socketio_service.emit_token(to=self.sid, data=resp.model_dump())
                await self.socketio_service.emit_token(
                    to=self.sid, data=resp.model_dump()
                )
        except Exception as exc:
            logger.error(f"Error sending response: {exc}")
@@ -92,7 +101,9 @@ class AsyncStreamingLLMCallbackHandleSIO(AsyncCallbackHandler):
        resp = PromptResponse(
            prompt=text,
        )
        await self.socketio_service.emit_message(to=self.sid, data=resp.model_dump())
        await self.socketio_service.emit_message(
            to=self.sid, data=resp.model_dump()
        )

    async def on_agent_action(self, action: AgentAction, **kwargs: Any):
        log = f"Thought: {action.log}"
@@ -102,7 +113,9 @@ class AsyncStreamingLLMCallbackHandleSIO(AsyncCallbackHandler):
            logs = log.split("\n")
            for log in logs:
                resp = ChatResponse(message="", type="stream", intermediate_steps=log)
                await self.socketio_service.emit_token(to=self.sid, data=resp.model_dump())
                await self.socketio_service.emit_token(
                    to=self.sid, data=resp.model_dump()
                )
        else:
            resp = ChatResponse(message="", type="stream", intermediate_steps=log)
            await self.socketio_service.emit_token(to=self.sid, data=resp.model_dump())

src/backend/langflow/components/tools/SearchApi.py (new file, 51 lines)
@@ -0,0 +1,51 @@
from langflow import CustomComponent
from langchain.schema import Document
from langflow.services.database.models.base import orjson_dumps
from langchain_community.utilities.searchapi import SearchApiAPIWrapper
from typing import Optional


class SearchApi(CustomComponent):
    display_name: str = "SearchApi"
    description: str = "Real-time search engine results API."
    output_types: list[str] = ["Document"]
    documentation: str = "https://www.searchapi.io/docs/google"
    field_config = {
        "engine": {
            "display_name": "Engine",
            "field_type": "str",
            "info": "The search engine to use.",
        },
        "params": {
            "display_name": "Parameters",
            "info": "The parameters to send with the request.",
        },
        "code": {"show": False},
        "api_key": {
            "display_name": "API Key",
            "field_type": "str",
            "required": True,
            "password": True,
            "info": "The API key to use SearchApi.",
        },
    }

    def build(
        self,
        engine: str,
        api_key: str,
        params: Optional[dict] = None,
    ) -> Document:
        if params is None:
            params = {}

        search_api_wrapper = SearchApiAPIWrapper(engine=engine, searchapi_api_key=api_key)

        q = params.pop("q", "SearchApi Langflow")
        results = search_api_wrapper.results(q, **params)

        result = orjson_dumps(results, indent_2=False)

        document = Document(page_content=result)

        return document
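The `build` method in the new component defaults a missing `params` dict and pops the query out of it before forwarding the remaining keys to the wrapper. That parameter handling can be isolated as a small sketch (the `split_query` helper is a hypothetical stand-in, not code from the repository):

```python
from typing import Optional


def split_query(params: Optional[dict] = None, default_q: str = "SearchApi Langflow"):
    """Mirror the component's handling: default params, pop 'q', keep the rest."""
    if params is None:
        params = {}
    # The popped query goes as the positional argument; everything else
    # is forwarded as keyword search parameters.
    q = params.pop("q", default_q)
    return q, params


q, rest = split_query({"q": "langflow", "num": 3})
print(q, rest)  # langflow {'num': 3}
```

Popping `q` rather than copying it matters: otherwise the query would also be passed as a keyword argument and collide with the positional one.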
@@ -3,11 +3,10 @@ from typing import Any
from langchain.agents import AgentExecutor
from langchain.chains.base import Chain
from langchain_core.runnables import Runnable
from loguru import logger

from langflow.api.v1.schemas import ChatMessage
from langflow.interface.utils import try_setting_streaming_options
from langflow.processing.base import get_result_and_steps
from loguru import logger

LANGCHAIN_RUNNABLES = (Chain, Runnable, AgentExecutor)
@@ -23,7 +22,9 @@ async def process_graph(

    if build_result is None:
        # Raise user facing error
        raise ValueError("There was an error loading the langchain_object. Please, check all the nodes and try again.")
        raise ValueError(
            "There was an error loading the langchain_object. Please, check all the nodes and try again."
        )

    # Generate result and thought
    try:
@@ -39,7 +40,6 @@ async def process_graph(
            client_id=client_id,
            session_id=session_id,
        )

    else:
        raise TypeError(f"Unknown type {type(build_result)}")
    logger.debug("Generated result and intermediate_steps")
@@ -50,5 +50,7 @@ async def process_graph(
        raise e


async def run_build_result(build_result: Any, chat_inputs: ChatMessage, client_id: str, session_id: str):
async def run_build_result(
    build_result: Any, chat_inputs: ChatMessage, client_id: str, session_id: str
):
    return build_result(inputs=chat_inputs.message)
@@ -111,13 +111,11 @@ class DatabaseService(Service):

        return True

    def init_alembic(self):
    def init_alembic(self, alembic_cfg):
        logger.info("Initializing alembic")
        alembic_cfg = Config()
        alembic_cfg.set_main_option("script_location", str(self.script_location))
        alembic_cfg.set_main_option("sqlalchemy.url", self.database_url)
        command.stamp(alembic_cfg, "head")
        # command.upgrade(alembic_cfg, "head")
        command.ensure_version(alembic_cfg)
        # alembic_cfg.attributes["connection"].commit()
        command.upgrade(alembic_cfg, "head")
        logger.info("Alembic initialized")

    def run_migrations(self, fix=False):
@@ -126,6 +124,11 @@ class DatabaseService(Service):
        # if not self.script_location.exists():  # this is not the correct way to check if alembic has been initialized
        # We need to check if the alembic_version table exists
        # if not, we need to initialize alembic
        alembic_cfg = Config()
        # alembic_cfg.attributes["connection"] = session
        alembic_cfg.set_main_option("script_location", str(self.script_location))
        alembic_cfg.set_main_option("sqlalchemy.url", self.database_url)
        should_initialize_alembic = False
        with Session(self.engine) as session:
            # If the table does not exist it throws an error
            # so we need to catch it
@@ -133,18 +136,19 @@ class DatabaseService(Service):
                session.exec(text("SELECT * FROM alembic_version"))
            except Exception:
                logger.info("Alembic not initialized")
                try:
                    self.init_alembic()
                except Exception as exc:
                    logger.error(f"Error initializing alembic: {exc}")
                    raise RuntimeError("Error initializing alembic") from exc
                should_initialize_alembic = True

            else:
                logger.info("Alembic already initialized")
        if should_initialize_alembic:
            try:
                self.init_alembic(alembic_cfg)
            except Exception as exc:
                logger.error(f"Error initializing alembic: {exc}")
                raise RuntimeError("Error initializing alembic") from exc

        logger.info(f"Running DB migrations in {self.script_location}")
        alembic_cfg = Config()
        alembic_cfg.set_main_option("script_location", str(self.script_location))
        alembic_cfg.set_main_option("sqlalchemy.url", self.database_url)

        try:
            command.check(alembic_cfg)
        except Exception as exc:
@@ -155,7 +159,7 @@ class DatabaseService(Service):
            try:
                command.check(alembic_cfg)
            except util.exc.AutogenerateDiffsDetected as e:
                logger.exception("AutogenerateDiffsDetected: {exc}")
                logger.error("AutogenerateDiffsDetected: {exc}")
                if not fix:
                    raise RuntimeError(
                        "Something went wrong running migrations. Please, run `langflow migration --fix`"
@@ -5,7 +5,10 @@ from langflow.services.auth.utils import create_super_user, verify_password
from langflow.services.database.utils import initialize_database
from langflow.services.manager import service_manager
from langflow.services.schema import ServiceType
from langflow.services.settings.constants import DEFAULT_SUPERUSER, DEFAULT_SUPERUSER_PASSWORD
from langflow.services.settings.constants import (
    DEFAULT_SUPERUSER,
    DEFAULT_SUPERUSER_PASSWORD,
)
from langflow.services.socket.utils import set_socketio_server

from .deps import get_db_service, get_session, get_settings_service
@@ -19,7 +22,9 @@ def get_factories_and_deps():
    from langflow.services.database import factory as database_factory
    from langflow.services.monitor import factory as monitor_factory
    from langflow.services.plugins import factory as plugins_factory
    from langflow.services.session import factory as session_service_factory  # type: ignore
    from langflow.services.session import (
        factory as session_service_factory,
    )  # type: ignore
    from langflow.services.settings import factory as settings_factory
    from langflow.services.socket import factory as socket_factory
    from langflow.services.storage import factory as storage_factory
@@ -48,8 +53,14 @@ def get_factories_and_deps():
        ),
        (plugins_factory.PluginServiceFactory(), [ServiceType.SETTINGS_SERVICE]),
        (store_factory.StoreServiceFactory(), [ServiceType.SETTINGS_SERVICE]),
        (credentials_factory.CredentialServiceFactory(), [ServiceType.SETTINGS_SERVICE]),
        (storage_factory.StorageServiceFactory(), [ServiceType.SESSION_SERVICE, ServiceType.SETTINGS_SERVICE]),
        (
            credentials_factory.CredentialServiceFactory(),
            [ServiceType.SETTINGS_SERVICE],
        ),
        (
            storage_factory.StorageServiceFactory(),
            [ServiceType.SESSION_SERVICE, ServiceType.SETTINGS_SERVICE],
        ),
        (monitor_factory.MonitorServiceFactory(), [ServiceType.SETTINGS_SERVICE]),
        (socket_factory.SocketIOFactory(), [ServiceType.CACHE_SERVICE]),
    ]
@@ -81,12 +92,16 @@ def get_or_create_super_user(session: Session, username, password, is_default):
            )
            return None
        else:
            logger.debug("User with superuser credentials exists but is not a superuser.")
            logger.debug(
                "User with superuser credentials exists but is not a superuser."
            )
            return None

    if user:
        if verify_password(password, user.password):
            raise ValueError("User with superuser credentials exists but is not a superuser.")
            raise ValueError(
                "User with superuser credentials exists but is not a superuser."
            )
        else:
            raise ValueError("Incorrect superuser credentials")
@@ -115,15 +130,21 @@ def setup_superuser(settings_service, session: Session):
    username = settings_service.auth_settings.SUPERUSER
    password = settings_service.auth_settings.SUPERUSER_PASSWORD

    is_default = (username == DEFAULT_SUPERUSER) and (password == DEFAULT_SUPERUSER_PASSWORD)
    is_default = (username == DEFAULT_SUPERUSER) and (
        password == DEFAULT_SUPERUSER_PASSWORD
    )

    try:
        user = get_or_create_super_user(session=session, username=username, password=password, is_default=is_default)
        user = get_or_create_super_user(
            session=session, username=username, password=password, is_default=is_default
        )
        if user is not None:
            logger.debug("Superuser created successfully.")
    except Exception as exc:
        logger.exception(exc)
        raise RuntimeError("Could not create superuser. Please create a superuser manually.") from exc
        raise RuntimeError(
            "Could not create superuser. Please create a superuser manually."
        ) from exc
    finally:
        settings_service.auth_settings.reset_credentials()
@@ -137,7 +158,9 @@ def teardown_superuser(settings_service, session):

    if not settings_service.auth_settings.AUTO_LOGIN:
        try:
            logger.debug("AUTO_LOGIN is set to False. Removing default superuser if exists.")
            logger.debug(
                "AUTO_LOGIN is set to False. Removing default superuser if exists."
            )
            username = DEFAULT_SUPERUSER
            from langflow.services.database.models.user.model import User

@@ -181,11 +204,15 @@ def initialize_session_service():
    Initialize the session manager.
    """
    from langflow.services.cache import factory as cache_factory
    from langflow.services.session import factory as session_service_factory  # type: ignore
    from langflow.services.session import (
        factory as session_service_factory,
    )  # type: ignore

    initialize_settings_service()

    service_manager.register_factory(cache_factory.CacheServiceFactory(), dependencies=[ServiceType.SETTINGS_SERVICE])
    service_manager.register_factory(
        cache_factory.CacheServiceFactory(), dependencies=[ServiceType.SETTINGS_SERVICE]
    )

    service_manager.register_factory(
        session_service_factory.SessionServiceFactory(),
@@ -202,7 +229,9 @@ def initialize_services(fix_migration: bool = False, socketio_server=None):
            service_manager.register_factory(factory, dependencies=dependencies)
        except Exception as exc:
            logger.exception(exc)
            raise RuntimeError("Could not initialize services. Please check your settings.") from exc
            raise RuntimeError(
                "Could not initialize services. Please check your settings."
            ) from exc

    # Test cache connection
    service_manager.get(ServiceType.CACHE_SERVICE)
@@ -210,9 +239,11 @@ def initialize_services(fix_migration: bool = False, socketio_server=None):
    try:
        initialize_database(fix_migration=fix_migration)
    except Exception as exc:
        logger.exception(exc)
        logger.error(exc)
        raise exc
    setup_superuser(service_manager.get(ServiceType.SETTINGS_SERVICE), next(get_session()))
    setup_superuser(
        service_manager.get(ServiceType.SETTINGS_SERVICE), next(get_session())
    )
    try:
        get_db_service().migrate_flows_if_auto_login()
    except Exception as exc:
@@ -1,36 +1,63 @@
import { XCircle } from "lucide-react";
import { crashComponentPropsType } from "../../types/components";
import { Button } from "../ui/button";
import { Card, CardContent, CardFooter, CardHeader } from "../ui/card";

export default function CrashErrorComponent({
  error,
  resetErrorBoundary,
}: crashComponentPropsType): JSX.Element {
  return (
    <div className="fixed left-0 top-0 z-50 flex h-full w-full items-center justify-center bg-foreground bg-opacity-50">
      <div className="flex h-1/3 min-h-fit max-w-4xl flex-col justify-evenly rounded-lg bg-background p-8 text-start shadow-lg">
        <h1 className="mb-4 text-3xl text-status-red">
          Oops! An unknown error has occurred.
        </h1>
        <p className="mb-4 text-xl text-foreground">
          Please click the 'Reset Application' button to restore the
          application's state. If the error persists, please create an issue on
          our GitHub page. We apologize for any inconvenience this may have
          caused.
        </p>
        <div className="flex justify-center">
          <button
            onClick={resetErrorBoundary}
            className="mr-4 rounded bg-primary px-4 py-2 font-bold text-background hover:bg-ring"
          >
            Reset Application
          </button>
          <a
            href="https://github.com/logspace-ai/langflow/issues/new"
            target="_blank"
            rel="noopener noreferrer"
            className="rounded bg-status-red px-4 py-2 font-bold text-background hover:bg-error-foreground"
          >
            Create Issue
          </a>
    <div className="z-50 flex h-screen w-screen items-center justify-center bg-foreground bg-opacity-50">
      <div className="flex h-screen w-screen flex-col bg-background text-start shadow-lg">
        <div className="m-auto grid w-1/2 justify-center gap-5 text-center">
          <Card className="p-8">
            <CardHeader>
              <div className="m-auto">
                <XCircle strokeWidth={1.5} className="h-16 w-16" />
              </div>
              <div>
                <p className="mb-4 text-xl text-foreground">
                  Sorry, we found an unexpected error!
                </p>
              </div>
            </CardHeader>

            <CardContent className="grid">
              <div>
                <p>
                  Please report errors with detailed tracebacks on the{" "}
                  <a
                    href="https://github.com/logspace-ai/langflow/issues"
                    target="_blank"
                    rel="noopener noreferrer"
                    className="font-medium hover:underline "
                  >
                    GitHub Issues
                  </a>{" "}
                  page.
                  <br></br>
                  Thank you!
                </p>
              </div>
            </CardContent>

            <CardFooter>
              <div className="m-auto mt-4 flex justify-center">
                <Button onClick={resetErrorBoundary}>Restart Langflow</Button>

                <a
                  href="https://github.com/logspace-ai/langflow/issues/new"
                  target="_blank"
                  rel="noopener noreferrer"
                >
                  <Button className="ml-3" variant={"outline"}>
                    Report on GitHub
                  </Button>
                </a>
              </div>
            </CardFooter>
          </Card>
        </div>
      </div>
    </div>
@@ -7,6 +7,7 @@ import {
  requestLogout,
} from "../controllers/API";
import useAlertStore from "../stores/alertStore";
import useFlowsManagerStore from "../stores/flowsManagerStore";
import { Users } from "../types/api";
import { AuthContextType } from "../types/contexts/auth";

@@ -79,6 +80,7 @@ export function AuthProvider({ children }): React.ReactElement {
        getUser();
      } else {
        setLoading(false);
        useFlowsManagerStore.setState({ isLoading: false });
      }
    });
  }, [setUserData, setLoading, autoLogin, setIsAdmin]);
@@ -16,7 +16,7 @@ export const useTypesStore = create<TypesStoreType>((set, get) => ({
    setLoading(true);
    getAll()
      .then((response) => {
        const data = response.data;
        const data = response?.data;
        useAlertStore.setState({ loading: false });
        set((old) => ({
          types: typesGenerator(data),
@@ -29,7 +29,10 @@ def poll_task_status(client, headers, href, max_attempts=20, sleep_time=1):
            href,
            headers=headers,
        )
        if task_status_response.status_code == 200 and task_status_response.json()["status"] == "SUCCESS":
        if (
            task_status_response.status_code == 200
            and task_status_response.json()["status"] == "SUCCESS"
        ):
            return task_status_response.json()
        time.sleep(sleep_time)
    return None  # Return None if task did not complete in time
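The `poll_task_status` helper above retries the status endpoint until it reports success or the attempts run out. The same pattern can be sketched with the HTTP call injected as a plain function, which makes it testable without a server (`poll_until_success` and its `fetch` parameter are illustrative names, not the test suite's fixtures):

```python
import time


def poll_until_success(fetch, max_attempts=20, sleep_time=0):
    """Call fetch() up to max_attempts times; return the first SUCCESS payload, else None."""
    for _ in range(max_attempts):
        status_code, payload = fetch()
        # Mirror the guard above: both a 200 response and a SUCCESS status are required.
        if status_code == 200 and payload.get("status") == "SUCCESS":
            return payload
        time.sleep(sleep_time)
    return None


responses = iter([(200, {"status": "PENDING"}), (200, {"status": "SUCCESS", "result": 42})])
print(poll_until_success(lambda: next(responses)))
# {'status': 'SUCCESS', 'result': 42}
```

Returning `None` on exhaustion rather than raising lets the caller assert on the outcome, which is how the fixture is consumed in the tests.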
@@ -123,7 +126,11 @@ def created_api_key(active_user):
    )
    db_manager = get_db_service()
    with session_getter(db_manager) as session:
        if existing_api_key := session.query(ApiKey).filter(ApiKey.api_key == api_key.api_key).first():
        if (
            existing_api_key := session.query(ApiKey)
            .filter(ApiKey.api_key == api_key.api_key)
            .first()
        ):
            return existing_api_key
        session.add(api_key)
        session.commit()
@@ -289,7 +296,11 @@ def test_get_all(client: TestClient, logged_in_headers):
    dir_reader = DirectoryReader(settings.COMPONENTS_PATH[0])
    files = dir_reader.get_files()
    # json_response is a dict of dicts
    all_names = [component_name for _, components in response.json().items() for component_name in components]
    all_names = [
        component_name
        for _, components in response.json().items()
        for component_name in components
    ]
    json_response = response.json()
    # We need to test the custom nodes
    assert len(all_names) > len(files)
@@ -414,13 +425,19 @@ def test_various_prompts(client, prompt, expected_input_variables):


def test_get_vertices_flow_not_found(client, logged_in_headers):
    response = client.get("/api/v1/build/nonexistent_id/vertices", headers=logged_in_headers)
    assert response.status_code == 500  # Or whatever status code you've set for invalid ID
    response = client.get(
        "/api/v1/build/nonexistent_id/vertices", headers=logged_in_headers
    )
    assert (
        response.status_code == 500
    )  # Or whatever status code you've set for invalid ID


def test_get_vertices(client, added_flow_with_prompt_and_history, logged_in_headers):
    flow_id = added_flow_with_prompt_and_history["id"]
    response = client.get(f"/api/v1/build/{flow_id}/vertices", headers=logged_in_headers)
    response = client.get(
        f"/api/v1/build/{flow_id}/vertices", headers=logged_in_headers
    )
    assert response.status_code == 200
    assert "ids" in response.json()
    # The response should contain the list in this order
@@ -436,13 +453,19 @@ def test_get_vertices(client, added_flow_with_prompt_and_history, logged_in_head


def test_build_vertex_invalid_flow_id(client, logged_in_headers):
    response = client.post("/api/v1/build/nonexistent_id/vertices/vertex_id", headers=logged_in_headers)
    response = client.post(
        "/api/v1/build/nonexistent_id/vertices/vertex_id", headers=logged_in_headers
    )
    assert response.status_code == 500


def test_build_vertex_invalid_vertex_id(client, added_flow_with_prompt_and_history, logged_in_headers):
def test_build_vertex_invalid_vertex_id(
    client, added_flow_with_prompt_and_history, logged_in_headers
):
    flow_id = added_flow_with_prompt_and_history["id"]
    response = client.post(f"/api/v1/build/{flow_id}/vertices/invalid_vertex_id", headers=logged_in_headers)
    response = client.post(
        f"/api/v1/build/{flow_id}/vertices/invalid_vertex_id", headers=logged_in_headers
    )
    assert response.status_code == 500
@@ -1,4 +1,5 @@
from fastapi.testclient import TestClient

from langflow.services.deps import get_settings_service

@@ -9,42 +10,3 @@ def test_prompts_settings(client: TestClient, logged_in_headers):
    json_response = response.json()
    prompts = json_response["prompts"]
    assert set(prompts.keys()) == set(settings_service.settings.PROMPTS)


def test_prompt_template(client: TestClient, logged_in_headers):
    response = client.get("api/v1/all", headers=logged_in_headers)
    assert response.status_code == 200
    json_response = response.json()
    prompts = json_response["prompts"]

    prompt = prompts["PromptTemplate"]
    template = prompt["template"]
    assert template["input_variables"] == {
        "required": True,
        "dynamic": True,
        "placeholder": "",
        "show": False,
        "multiline": False,
        "password": False,
        "name": "input_variables",
        "type": "str",
        "list": True,
        "advanced": False,
        "info": "",
        "fileTypes": [],
    }

    assert template["template"] == {
        "required": True,
        "dynamic": True,
        "placeholder": "",
        "show": True,
        "multiline": True,
        "password": False,
        "name": "template",
        "type": "prompt",
        "list": False,
        "advanced": False,
        "info": "",
        "fileTypes": [],
    }