diff --git a/docs/docs/Agents/agent-tool-calling-agent-component.md b/docs/docs/Agents/agent-tool-calling-agent-component.md
index cc8dcf289..bbbe2dd83 100644
--- a/docs/docs/Agents/agent-tool-calling-agent-component.md
+++ b/docs/docs/Agents/agent-tool-calling-agent-component.md
@@ -54,7 +54,7 @@ Point **API Request** to an online rules document, tell your agent `You are a fu
 * You need to learn a new software language quickly. Point **API Request** to some docs, tell your agent `You are a knowledgeable software developer who uses the tools at your disposal`, and start learning.
 
-See what problems you can solve with this flow. As your problem becomes more specialized, add a tool. For example, the [math agent project](/math-agent) adds a Python REPL component to solve math problems that are too challenging for the calculator.
+See what problems you can solve with this flow. As your problem becomes more specialized, add more tools. For example, add a Python REPL component to solve math problems that are too challenging for the calculator.
 
 ### Edit a tool's metadata
diff --git a/docs/docs/Concepts/concepts-voice-mode.md b/docs/docs/Concepts/concepts-voice-mode.md
index e358d382d..00fd3ca9f 100644
--- a/docs/docs/Concepts/concepts-voice-mode.md
+++ b/docs/docs/Concepts/concepts-voice-mode.md
@@ -13,7 +13,7 @@ Your flow must have a [Chat input](/components-io#chat-input) component to inter
 
 ## Prerequisite
 
-- [OpenAI API key created](https://platform.openai.com/)
+- [An OpenAI API key](https://platform.openai.com/)
 
 ## Use voice mode in the Langflow Playground
 
diff --git a/docs/docs/Develop/session-id.md b/docs/docs/Develop/session-id.md
index 99aa0569d..391399f65 100644
--- a/docs/docs/Develop/session-id.md
+++ b/docs/docs/Develop/session-id.md
@@ -33,7 +33,8 @@ The `my_custom_session_value` value is used in components that accept it, and th
 
 ## Retrieval of messages from memory by session ID
 
-Add a [Message store](/components-helpers#message-store) component to a flow to access the default `langflow.db` database. The component accepts `sessionID` as a filter parameter, and uses the session ID value from upstream automatically to retrieve message history by session ID from storage.
+To retrieve messages from local Langflow memory, add a [Message history](/components-helpers#message-history) component to your flow.
+The component accepts `sessionID` as a filter parameter, and uses the session ID value from upstream automatically to retrieve message history by session ID from storage.
 
 Messages can be retrieved by `session_id` from the `/monitor` endpoint in the API. For more information, see the [API examples](https://docs.langflow.org/api-reference-api-examples#get-messages).
diff --git a/docs/docs/Get-Started/get-started-quickstart.md b/docs/docs/Get-Started/get-started-quickstart.md
index fb8392d38..d70f40f0c 100644
--- a/docs/docs/Get-Started/get-started-quickstart.md
+++ b/docs/docs/Get-Started/get-started-quickstart.md
@@ -9,17 +9,18 @@ Get to know Langflow by building an OpenAI-powered chatbot application. After yo
 
 ## Prerequisites
 
-* [An OpenAI API key](https://platform.openai.com/)
-* [An Astra DB vector database](https://docs.datastax.com/en/astra-db-serverless/get-started/quickstart.html) with:
-    * An Astra DB application token scoped to read and write to the database
-    * A collection created in [Astra](https://docs.datastax.com/en/astra-db-serverless/databases/manage-collections.html#create-collection) or a new collection created in the **Astra DB** component
+- [A running Langflow instance](/get-started-installation)
+- [An OpenAI API key](https://platform.openai.com/)
+- [An Astra DB vector database](https://docs.datastax.com/en/astra-db-serverless/get-started/quickstart.html) with:
+  - An Astra DB application token scoped to read and write to the database
+  - A collection created in [Astra](https://docs.datastax.com/en/astra-db-serverless/databases/manage-collections.html#create-collection) or a new collection created in the **Astra DB** component
 
 ## Open Langflow and start a new project
 
 1. From the Langflow dashboard, click **New Flow**, and then select **Blank Flow**. A blank workspace opens where you can build your flow.
 
 :::tip
-If you don't want to create a blank flow, click **New Flow**, and then select **Basic Prompting** for a pre-built flow.
+If you want a pre-built flow, click **New Flow**, and then select **Basic Prompting**.
 Continue to [Run the basic prompting flow](#run-basic-prompting-flow).
 :::
 
@@ -68,8 +69,8 @@ Add your OpenAI API key to the OpenAI model component, and add a prompt to the P
 
 1. Add your credentials to the OpenAI component.
 The fastest way to complete these fields is with Langflow’s [Global Variables](/configuration-global-variables).
-    1. In the OpenAI component’s OpenAI API Key field, click the **Globe** button, and then click **Add New Variable**.
-    Alternatively, click your username in the top right corner, and then click **Settings**, **Global Variables**, and then **Add New**.
+    1. In the OpenAI component’s OpenAI API Key field, click the **Globe** button, and then click **Add New Variable**.
+    Alternatively, click your user icon in the top right corner, and then click **Settings**, **Global Variables**, and then **Add New**.
 2. Name your variable. Paste your OpenAI API key (sk-…​) in the Value field.
 3. In the **Apply To Fields** field, select the OpenAI API Key field to apply this variable to all OpenAI Embeddings components.
@@ -122,8 +123,8 @@ The [Astra DB vector store](/components-vector-stores#astra-db-vector-store) com
 The [File](/components-data#file) component loads files from your local machine.
 4. Click **Processing**, select the **Split Text** component, and then drag it to the canvas.
 The [Split Text](/components-processing#split-text) component splits the loaded text into smaller chunks.
-5. Click **Processing**, select the **Parse Data** component, and then drag it to the canvas.
-The [Data to Message](/components-processing#data-to-message) component converts the data from the **Astra DB** component into plain text.
+5. Click **Processing**, select the **Parser** component, and then drag it to the canvas.
+The [Parser](/components-processing#parser) component converts the data from the **Astra DB** component into plain text.
 6. Click **Embeddings**, select the **OpenAI Embeddings** component, and then drag it to the canvas.
 The [OpenAI Embeddings](/components-embedding-models#openai-embeddings) component generates embeddings for the user's input, which are compared to the vector data in the database.
 7. Connect the new components into the existing flow, so your flow looks like this:
diff --git a/docs/docs/Sample-Flows/blog-writer.md b/docs/docs/Sample-Flows/blog-writer.md
index 1f3029870..354b78444 100644
--- a/docs/docs/Sample-Flows/blog-writer.md
+++ b/docs/docs/Sample-Flows/blog-writer.md
@@ -5,16 +5,14 @@ slug: /blog-writer
 
 Build a Blog Writer flow for a one-shot application using OpenAI.
 
-This flow extends the Basic Prompting flow with the **URL** and **Parse data** components that fetch content from multiple URLs and convert the loaded data into plain text.
-
-OpenAI uses this loaded data to generate a blog post, as instructed by the **Text input** component.
+This flow extends the Basic Prompting flow with the **URL** and **Parser** components that fetch content from multiple URLs and convert the loaded data into plain text.
+OpenAI uses this loaded data to generate a blog post, as instructed by the **Text Input** and **Prompt** components.
 
 ## Prerequisites
 
-- [Langflow installed and running](/get-started-installation)
-- [OpenAI API key created](https://platform.openai.com/)
-
+- [A running Langflow instance](/get-started-installation)
+- [An OpenAI API key](https://platform.openai.com/)
 
 ## Create the blog writer flow
 
@@ -25,10 +23,10 @@ OpenAI uses this loaded data to generate a blog post, as instructed by the **Tex
 
 ![](/img/starter-flow-blog-writer.png)
 
-This flow creates a one-shot article generator with **Prompt**, **OpenAI**, and **Chat Output** components, augmented with reference content and instructions from the **URL** and **Text Input** components.
+This flow creates a blog article generator with **Prompt**, **OpenAI**, and **Chat Output** components, augmented with reference content and instructions from the **URL** and **Text Input** components.
 
 The **URL** component extracts raw text and metadata from one or more web links.
-The **Parse Data** component converts the data coming from the **URL** component into plain text to feed the prompt.
+The **Parser** component converts the data coming from the **URL** component into plain text to feed the prompt.
 
 To examine the flow's prompt, click the **Template** field of the **Prompt** component.
 
@@ -44,7 +42,7 @@ Reference 1:
 
 Blog:
 ```
 
-The `{instructions}` value is received from the **Text input** component, and one or more `{references}` are received from a list of URLs parsed from the **URL** component.
+The `{instructions}` value is received from the **Text Input** component, and one or more `{references}` are received from a list of URLs parsed from the **URL** component.
 
 ### Run the blog writer flow
 
diff --git a/docs/docs/Sample-Flows/document-qa.md b/docs/docs/Sample-Flows/document-qa.md
index 1332ea690..b0a7a8ddc 100644
--- a/docs/docs/Sample-Flows/document-qa.md
+++ b/docs/docs/Sample-Flows/document-qa.md
@@ -7,8 +7,8 @@ Build a question-and-answer chatbot with a document loaded from local memory.
 
 ## Prerequisites
 
-- [Langflow installed and running](/get-started-installation)
-- [OpenAI API key created](https://platform.openai.com/)
+- [A running Langflow instance](/get-started-installation)
+- [An OpenAI API key](https://platform.openai.com/)
 
 ## Create the document QA flow
 
@@ -18,7 +18,7 @@ Build a question-and-answer chatbot with a document loaded from local memory.
 
 ![](/img/starter-flow-document-qa.png)
 
-This flow is composed of a chatbot with the **Chat Input**, **Prompt**, **OpenAI**, and **Chat Output** components, but also incorporates a **File** component, which loads a file from your local machine. **Parse Data** is used to convert the data from **File** into the **Prompt** component as `{Document}`.
+This flow is composed of a chatbot with the **Chat Input**, **Prompt**, **OpenAI**, and **Chat Output** components, but also incorporates a **File** component, which loads a file from your local machine. The **Parser** component converts the data from the **File** component into the **Prompt** component as `{Document}`.
 The **Prompt** component is instructed to answer questions based on the contents of `{Document}`. This gives the **OpenAI** component context it would not otherwise have access to.
 
diff --git a/docs/docs/Sample-Flows/math-agent.md b/docs/docs/Sample-Flows/math-agent.md
deleted file mode 100644
index ebc436518..000000000
--- a/docs/docs/Sample-Flows/math-agent.md
+++ /dev/null
@@ -1,54 +0,0 @@
----
-title: Math agent
-slug: /math-agent
----
-
-import Icon from "@site/src/components/icon";
-
-Build a **Math Agent** flow for an agentic application using the **Tool-calling agent** component.
-
-In this flow, the **Tool-calling agent** reasons using an **Open AI** LLM to solve math problems.
-It selects the **Calculator** tool for simpler math and the **Python REPL** tool (with the Python `math` library) for more complex problems.
-
-## Prerequisites
-
-To use this flow, you need an OpenAI API key.
-
-## Open Langflow and start a new flow
-
-Click **New Flow**, and then select the **Math Agent** flow.
-
-This opens a starter flow with the necessary components to run an agentic application using the Tool-calling agent.
-
-## Math Agent flow
-
-![](/img/starter-flow-simple-agent-repl.png)
-
-The **Math Agent** flow consists of these components:
-
-* The **Tool calling agent** component uses the connected LLM to reason through the user's input and select among the connected tools to complete its task.
-* The **Python REPL tool** component executes Python code in a REPL (Read-Evaluate-Print Loop) interpreter.
-* The **Calculator** component performs basic arithmetic operations.
-* The **Chat Input** component accepts user input to the chat.
-* The **Chat Output** component prints the flow's output to the chat.
-
-## Run the Math Agent flow
-
-1. Add your credentials to the **Agent** component.
-2. Click **Playground** to start a chat session.
-3. Enter a simple math problem, like `2 + 2`, and then make sure the bot responds with the correct answer.
-4. To confirm the REPL interpreter is working, prompt the `math` library directly with `math.sqrt(4)` and see if the bot responds with `4`.
-5. The agent will also reason through more complex word problems. For example, prompt the agent with the following math problem:
-
-```text
-The equation 24x2+25x−47ax−2=−8x−3−53ax−2 is true for all values of x≠2a, where a is a constant.
-What is the value of a?
-A) -16
-B) -3
-C) 3
-D) 16
-```
-
-The agent should respond with `B`.
-
-Now that your query has completed the journey from **Chat input** to **Chat output**, you have completed the **Math Agent** flow.
diff --git a/docs/docs/Sample-Flows/memory-chatbot.md b/docs/docs/Sample-Flows/memory-chatbot.md
index e39b1c790..45adbf302 100644
--- a/docs/docs/Sample-Flows/memory-chatbot.md
+++ b/docs/docs/Sample-Flows/memory-chatbot.md
@@ -9,8 +9,8 @@ This flow extends the [basic prompting flow](/starter-projects-basic-prompting)
 
 ## Prerequisites
 
-- [Langflow installed and running](/get-started-installation)
-- [OpenAI API key created](https://platform.openai.com/)
+- [A running Langflow instance](/get-started-installation)
+- [An OpenAI API key](https://platform.openai.com/)
 
 ## Create the memory chatbot flow
 
@@ -37,14 +37,12 @@ History:
 ```
 
 The `{memory}` code in the prompt creates a new input port in the component called **memory**.
-The **Chat Memory** component is connected to this port to store chat messages from the **Playground**.
-
-This gives the **OpenAI** component a memory of previous chat messages.
+The **Chat Memory** component is connected to this port to store chat messages from the **Playground**, and provide the **OpenAI** component with a memory of previous chat messages.
 
 ## Run the memory chatbot flow
 
 1. Open the **Playground**.
-2. Type multiple questions. For example, try entering this conversation:
+2. Enter multiple questions. For example, try entering this conversation:
 
 ```plain
 Hi, my name is Luca.
diff --git a/docs/docs/Sample-Flows/sequential-agent.md b/docs/docs/Sample-Flows/sequential-agent.md
index e9695cc5e..44adff20d 100644
--- a/docs/docs/Sample-Flows/sequential-agent.md
+++ b/docs/docs/Sample-Flows/sequential-agent.md
@@ -10,6 +10,8 @@ Each agent has an LLM model and a unique set of tools at its disposal, with **Pr
 Each successive agent in the flow builds on the work of the previous agent, creating a chain of reasoning for solving complex problems.
 
 ## Prerequisites
+
+- [A running Langflow instance](/get-started-installation)
 - [An OpenAI API key](https://platform.openai.com/)
 - [A Tavily AI API key](https://www.tavily.com/)
 
diff --git a/docs/docs/Sample-Flows/travel-planning-agent.md b/docs/docs/Sample-Flows/travel-planning-agent.md
index c6361b2ed..76c49ca9f 100644
--- a/docs/docs/Sample-Flows/travel-planning-agent.md
+++ b/docs/docs/Sample-Flows/travel-planning-agent.md
@@ -15,7 +15,9 @@ All agents have access to the **Search API** and **URL Content Fetcher** compone
 
 ## Prerequisites
 
-To use this flow, you need an [OpenAI API key](https://platform.openai.com/) and a [Search API key](https://www.searchapi.io/).
+- [A running Langflow instance](/get-started-installation)
+- [An OpenAI API key](https://platform.openai.com/)
+- [A Search API key](https://www.searchapi.io/)
 
 ## Open Langflow and start a new flow
 
diff --git a/docs/docs/Starter-Projects/starter-projects-basic-prompting.md b/docs/docs/Starter-Projects/starter-projects-basic-prompting.md
index 154f42d82..bc204ac09 100644
--- a/docs/docs/Starter-Projects/starter-projects-basic-prompting.md
+++ b/docs/docs/Starter-Projects/starter-projects-basic-prompting.md
@@ -14,8 +14,8 @@ This article demonstrates how to use Langflow's prompt tools to issue basic prom
 
 ## Prerequisites
 
-- [Langflow installed and running](/get-started-installation)
-- [OpenAI API key created](https://platform.openai.com/)
+- [A running Langflow instance](/get-started-installation)
+- [An OpenAI API key](https://platform.openai.com/)
 
 ## Create the basic prompting flow
 
diff --git a/docs/docs/Starter-Projects/starter-projects-simple-agent.md b/docs/docs/Starter-Projects/starter-projects-simple-agent.md
index 79a1fc804..c972b1546 100644
--- a/docs/docs/Starter-Projects/starter-projects-simple-agent.md
+++ b/docs/docs/Starter-Projects/starter-projects-simple-agent.md
@@ -3,7 +3,7 @@ title: Simple agent
 slug: /starter-projects-simple-agent
 ---
 
-Build a **Simple Agent** flow for an agentic application using the **Tool-calling agent** component.
+Build a **Simple Agent** flow for an agentic application using the [Tool-calling agent](/agents-tool-calling-agent-component) component.
 
 An **agent** uses an LLM as its "brain" to select among the connected tools and complete its tasks.
 
@@ -12,7 +12,8 @@ The agent selects the **Calculator** tool for simple math problems and the **URL
 
 ## Prerequisites
 
-To use this flow, you need an OpenAI API key.
+- [A running Langflow instance](/get-started-installation)
+- [An OpenAI API key](https://platform.openai.com/)
 
 ## Open Langflow and start a new flow
 
@@ -47,7 +48,7 @@ as_dataframe: Load and retrieve data in a structured format (dataframe) from spe
 get_current_date: Returns the current date and time in a selected timezone.
 ```
 
 4. Ask the agent a question. For example, ask it to create a tabletop character using your favorite rules set.
-The agent will tell you when it's using the `URL-fetch_content_text` tool to search for rules information, and when it's using `CalculatorComponent-evaluate_expression` to generate attributes with dice rolls.
+The agent tells you when it's using the `URL-fetch_content_text` tool to search for rules information, and when it's using `CalculatorComponent-evaluate_expression` to generate attributes with dice rolls.
 
 The final output should be similar to this:
 
 ```text
diff --git a/docs/docs/Starter-Projects/starter-projects-vector-store-rag.md b/docs/docs/Starter-Projects/starter-projects-vector-store-rag.md
index a833bed06..6d2835a72 100644
--- a/docs/docs/Starter-Projects/starter-projects-vector-store-rag.md
+++ b/docs/docs/Starter-Projects/starter-projects-vector-store-rag.md
@@ -19,10 +19,11 @@ We've chosen [Astra DB](https://astra.datastax.com/signup?utm_source=langflow-p
 
 ## Prerequisites
 
-* [An OpenAI API key](https://platform.openai.com/)
-* [An Astra DB vector database](https://docs.datastax.com/en/astra-db-serverless/get-started/quickstart.html) with the following:
-    * An Astra DB application token scoped to read and write to the database
-    * A collection created in [Astra](https://docs.datastax.com/en/astra-db-serverless/databases/manage-collections.html#create-collection) or a new collection created in the **Astra DB** component
+- [A running Langflow instance](/get-started-installation)
+- [An OpenAI API key](https://platform.openai.com/)
+- [An Astra DB vector database](https://docs.datastax.com/en/astra-db-serverless/get-started/quickstart.html) with the following:
+  - An Astra DB application token scoped to read and write to the database
+  - A collection created in [Astra](https://docs.datastax.com/en/astra-db-serverless/databases/manage-collections.html#create-collection) or a new collection created in the **Astra DB** component
 
 ## Open Langflow and start a new project
 
@@ -46,7 +47,7 @@ The **Retriever Flow** (top of the screen) embeds the user's queries into vecto
 
 - **Chat Input** receives user input from the **Playground**.
 - **OpenAI Embeddings** converts the user query into vector form.
 - **Astra DB** performs similarity search using the query vector.
-- **Parse Data** processes the retrieved chunks.
+- **Parser** processes the retrieved chunks.
 - **Prompt** combines the user query with relevant context.
 - **OpenAI** generates the response using the prompt.
 - **Chat Output** returns the response to the **Playground**.
diff --git a/docs/docusaurus.config.js b/docs/docusaurus.config.js
index ec9c9d38a..03651d09a 100644
--- a/docs/docusaurus.config.js
+++ b/docs/docusaurus.config.js
@@ -172,8 +172,9 @@ const config = {
       ],
     },
     {
-      to: "/math-agent",
+      to: "/starter-projects-simple-agent",
       from: [
+        "/math-agent",
         "/starter-projects-math-agent",
         "/tutorials-math-agent"
       ],
diff --git a/docs/sidebars.js b/docs/sidebars.js
index d5f4c906c..5f87f1512 100644
--- a/docs/sidebars.js
+++ b/docs/sidebars.js
@@ -25,7 +25,6 @@ module.exports = {
       'Sample-Flows/blog-writer',
       'Sample-Flows/document-qa',
       'Sample-Flows/memory-chatbot',
-      'Sample-Flows/math-agent',
       'Sample-Flows/sequential-agent',
       'Sample-Flows/travel-planning-agent',
     ],
diff --git a/docs/static/img/quickstart-add-document-ingestion.png b/docs/static/img/quickstart-add-document-ingestion.png
index 9bda4458d..b30cd8be8 100644
Binary files a/docs/static/img/quickstart-add-document-ingestion.png and b/docs/static/img/quickstart-add-document-ingestion.png differ
diff --git a/docs/static/img/starter-flow-blog-writer.png b/docs/static/img/starter-flow-blog-writer.png
index ef2efba8c..792601f25 100644
Binary files a/docs/static/img/starter-flow-blog-writer.png and b/docs/static/img/starter-flow-blog-writer.png differ
diff --git a/docs/static/img/starter-flow-document-qa.png b/docs/static/img/starter-flow-document-qa.png
index 6768579c2..5f60bda26 100644
Binary files a/docs/static/img/starter-flow-document-qa.png and b/docs/static/img/starter-flow-document-qa.png differ
diff --git a/docs/static/img/starter-flow-unstructured-qa.png b/docs/static/img/starter-flow-unstructured-qa.png
index 631038cdf..59af21d05 100644
Binary files a/docs/static/img/starter-flow-unstructured-qa.png and b/docs/static/img/starter-flow-unstructured-qa.png differ
diff --git a/docs/static/img/starter-flow-vector-rag.png b/docs/static/img/starter-flow-vector-rag.png
index d7830a641..5c15e26c2 100644
Binary files a/docs/static/img/starter-flow-vector-rag.png and b/docs/static/img/starter-flow-vector-rag.png differ
diff --git a/docs/static/img/url-component.png b/docs/static/img/url-component.png
index a0e9733cc..cde4838cb 100644
Binary files a/docs/static/img/url-component.png and b/docs/static/img/url-component.png differ
diff --git a/docs/static/img/vector-store-retrieval.png b/docs/static/img/vector-store-retrieval.png
index 132ba25a0..dabfce541 100644
Binary files a/docs/static/img/vector-store-retrieval.png and b/docs/static/img/vector-store-retrieval.png differ
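
The `session-id.md` hunk in this patch states that messages can be retrieved by `session_id` from the `/monitor` endpoint. As a minimal sketch of that request: the base URL, port, and the `/api/v1/monitor/messages` route here are assumptions for a default local Langflow install, not something this patch confirms, so check the linked API examples for the authoritative request shape.

```python
# Sketch: build and (optionally) send a request that filters message
# history by session_id via Langflow's /monitor endpoint.
# ASSUMPTIONS: default local address http://127.0.0.1:7860 and the
# /api/v1/monitor/messages route; verify against the API examples docs.
from urllib.parse import urlencode
import urllib.request

BASE_URL = "http://127.0.0.1:7860"  # assumed default local Langflow address


def monitor_messages_url(session_id: str) -> str:
    """Build the monitor-messages URL with a session_id query filter."""
    query = urlencode({"session_id": session_id})
    return f"{BASE_URL}/api/v1/monitor/messages?{query}"


url = monitor_messages_url("my_custom_session_value")
print(url)

# Uncomment to run against a live local instance:
# with urllib.request.urlopen(url) as response:
#     print(response.read().decode())
```

The live call is left commented out because it requires a running instance; the URL construction alone shows how the `session_id` filter from the doc change maps onto the query string.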