docs: merge starter projects and sample flows into templates (#8652)

* move-pages-and-add-redirects

* move-vector-rag-in-list
Mendon Kissling 2025-06-20 14:34:30 -04:00 committed by GitHub
commit d4fca1fe40
23 changed files with 39 additions and 36 deletions

@@ -116,7 +116,7 @@ A dictionary of all Langflow components.
 Execute a specified flow by ID or name.
 The flow is executed as a batch, but LLM responses can be streamed.
-This example runs a [Basic Prompting](/starter-projects-basic-prompting) flow with a given `flow_id` and passes a JSON object as the input value.
+This example runs a [Basic Prompting](/basic-prompting) flow with a given `flow_id` and passes a JSON object as the input value.
 The parameters are passed in the request body. In this example, the values are the default values.
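The request described in this hunk can be sketched with Python's standard library. This is a minimal sketch, not the documented example: the base URL, the `/api/v1/run/{flow_id}` path, and the payload keys are assumptions based on a default local Langflow install, and the flow ID is the placeholder value used elsewhere in these docs.

```python
import json
from urllib import request

# Hypothetical values: replace with your server URL and your flow's ID or name.
BASE_URL = "http://localhost:7860"
FLOW_ID = "dcbed533-859f-4b99-b1f5-16fce884f28f"

# Parameters are passed as a JSON object in the request body.
# These key names are assumptions about the run endpoint's defaults.
payload = {
    "input_value": "Hello, world!",  # placeholder input
    "input_type": "chat",
    "output_type": "chat",
}

url = f"{BASE_URL}/api/v1/run/{FLOW_ID}"
req = request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Uncomment to send the request against a running server;
# the batch response arrives as a single JSON document:
# with request.urlopen(req) as resp:
#     print(json.load(resp))
print(url)
```

Streaming responses, mentioned above, would use a different request shape than this batch sketch.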

@@ -17,7 +17,7 @@ The agent then uses a connected LLM to reason through the problem to decide whic
 ## Use an agent in a flow
-The [simple agent starter project](/starter-projects-simple-agent) uses an [agent component](#agent-component) connected to URL and Calculator tools to answer a user's questions. The OpenAI LLM acts as a brain for the agent to decide which tool to use. Tools are connected to agent components at the **Tools** port.
+The [simple agent starter project](/simple-agent) uses an [agent component](#agent-component) connected to URL and Calculator tools to answer a user's questions. The OpenAI LLM acts as a brain for the agent to decide which tool to use. Tools are connected to agent components at the **Tools** port.
 ![Simple agent starter flow](/img/starter-flow-simple-agent.png)

@@ -280,7 +280,7 @@ To run an embeddings inference locally, see the [HuggingFace documentation](http
 To connect the local Hugging Face model to the **Hugging Face embeddings inference** component and use it in a flow, follow these steps:
-1. Create a [Vector store RAG flow](/starter-projects-vector-store-rag).
+1. Create a [Vector store RAG flow](/vector-store-rag).
 There are two embeddings models in this flow that you can replace with **Hugging Face** embeddings inference components.
 2. Replace both **OpenAI** embeddings model components with **Hugging Face** model components.
 3. Connect both **Hugging Face** components to the **Embeddings** ports of the **Astra DB vector store** components.

@@ -237,7 +237,7 @@ The Run Flow component can also be used as a tool when connected to an [Agent](/
 When you select a flow, the component fetches the flow's graph structure and uses it to generate the inputs and outputs for the Run Flow component.
 To use the Run Flow component as a tool, do the following:
-1. Add the **Run Flow** component to the [Simple Agent](/starter-projects-simple-agent) flow.
+1. Add the **Run Flow** component to the [Simple Agent](/simple-agent) flow.
 2. In the **Flow Name** menu, select the sub-flow you want to run.
 The appearance of the **Run Flow** component changes to reflect the inputs and outputs of the selected flow.
 3. On the **Run Flow** component, enable **Tool Mode**.

@@ -17,7 +17,7 @@ Model components receive inputs and prompts for generating text, and the generat
 The model output can also be sent to the **Language Model** port and on to a **Parse Data** component, where the output can be parsed into structured [Data](/concepts-objects) objects.
-This example has the OpenAI model in a chatbot flow. For more information, see the [Basic prompting flow](/starter-projects-basic-prompting).
+This example has the OpenAI model in a chatbot flow. For more information, see the [Basic prompting flow](/basic-prompting).
 ![](/img/starter-flow-basic-prompting.png)
@@ -227,7 +227,7 @@ For more information, see the [Google Generative AI documentation](https://cloud
 This component generates text using Groq's language models.
-1. To use this component in a flow, connect it as a **Model** in a flow like the [Basic prompting flow](/starter-projects-basic-prompting), or select it as the **Model Provider** if you're using an **Agent** component.
+1. To use this component in a flow, connect it as a **Model** in a flow like the [Basic prompting flow](/basic-prompting), or select it as the **Model Provider** if you're using an **Agent** component.
 ![Groq component in a basic prompting flow](/img/component-groq.png)
@@ -274,9 +274,9 @@ This component sends requests to the Hugging Face API to generate text using the
 The Hugging Face API is a hosted inference API for models hosted on Hugging Face, and requires a [Hugging Face API token](https://huggingface.co/docs/hub/security-tokens) to authenticate.
-In this example based on the [Basic prompting flow](/starter-projects-basic-prompting), the **Hugging Face API** model component replaces the **Open AI** model. By selecting different hosted models, you can see how different models return different results.
+In this example based on the [Basic prompting flow](/basic-prompting), the **Hugging Face API** model component replaces the **Open AI** model. By selecting different hosted models, you can see how different models return different results.
-1. Create a [Basic prompting flow](/starter-projects-basic-prompting).
+1. Create a [Basic prompting flow](/basic-prompting).
 2. Replace the **OpenAI** model component with a **Hugging Face API** model component.

@@ -19,7 +19,7 @@ The agent then uses a connected LLM to reason through the problem to decide whic
 Tools are typically connected to agent components at the **Tools** port.
-The [simple agent starter project](/starter-projects-simple-agent) uses URL and Calculator tools connected to an [agent component](/components-agents#agent-component) to answer a user's questions. The OpenAI LLM acts as a brain for the agent to decide which tool to use.
+The [simple agent starter project](/simple-agent) uses URL and Calculator tools connected to an [agent component](/components-agents#agent-component) to answer a user's questions. The OpenAI LLM acts as a brain for the agent to decide which tool to use.
 ![Simple agent starter flow](/img/starter-flow-simple-agent.png)

@@ -81,7 +81,7 @@ The **Astra DB Vector Store** component offers two methods for generating embedd
 The embedding model selection is made when creating a new collection and cannot be changed later.
 :::
-For an example of using the **Astra DB Vector Store** component with an embedding model, see the [Vector Store RAG starter project](/starter-projects-vector-store-rag).
+For an example of using the **Astra DB Vector Store** component with an embedding model, see the [Vector Store RAG starter project](/vector-store-rag).
 For more information, see the [Astra DB Serverless documentation](https://docs.datastax.com/en/astra-db-serverless/databases/embedding-generation.html).

@@ -19,7 +19,7 @@ Your flow must have a [Chat input](/components-io#chat-input) component to inter
 Chat with an agent in the **Playground**, and get more recent results by asking the agent to use tools.
-1. Create a [Simple agent starter project](/starter-projects-simple-agent).
+1. Create a [Simple agent starter project](/simple-agent).
 2. Add your **OpenAI API key** credentials to the **Agent** component.
 3. To start a chat session, click **Playground**.
 4. To enable voice mode, click the <Icon name="Mic" aria-label="Microphone"/> icon.

@@ -16,7 +16,7 @@ For a sandbox example, see the [Langflow embedded chat CodeSandbox](https://code
 The following example includes the minimum required inputs, called [props](https://react.dev/learn/passing-props-to-a-component) in React, for using the chat widget in your HTML code, which are `host_url` and `flow_id`.
 The `host_url` value must be `HTTPS`, and may not include a `/` after the URL.
 The `flow_id` value is found in your Langflow URL.
-For a Langflow server running the [Basic prompting flow](/starter-projects-basic-prompting) at `https://c822-73-64-93-151.ngrok-free.app/flow/dcbed533-859f-4b99-b1f5-16fce884f28f`, your chat widget code is similar to the following:
+For a Langflow server running the [Basic prompting flow](/basic-prompting) at `https://c822-73-64-93-151.ngrok-free.app/flow/dcbed533-859f-4b99-b1f5-16fce884f28f`, your chat widget code is similar to the following:
 ```html
 <html>
 <head>

@@ -24,7 +24,7 @@ For example:
 * [Craft intelligent chatbots](/memory-chatbot)
 * [Build document analysis systems](/document-qa)
 * [Generate compelling content](/blog-writer)
-* [Orchestrate multi-agent applications](/starter-projects-simple-agent)
+* [Orchestrate multi-agent applications](/simple-agent)
 * [Create agents with Langflow](/agents-overview)
 * [Use Langflow as an MCP server](/mcp-server)
 * [Use Langflow as an MCP client](/mcp-client)

@@ -67,7 +67,7 @@ For more information, see the [Arize documentation](https://docs.arize.com/phoen
 ## Run a flow and view metrics in Arize
-1. In Langflow, select the [Simple agent](/starter-projects-simple-agent) starter project.
+1. In Langflow, select the [Simple agent](/simple-agent) starter project.
 2. In the **Agent** component's **OpenAI API Key** field, paste your **OpenAI API key**.
 3. Click **Playground**.
 Ask your Agent some questions to generate traffic.

@@ -17,7 +17,7 @@ Use the [MCP connection component](/mcp-client) to connect Langflow to a [Datast
 4. Get your database's **Astra DB API endpoint** and an **Astra DB application token** with the Database Administrator role. For more information, see [Generate an application token for a database](https://docs.datastax.com/en/astra-db-serverless/administration/manage-application-tokens.html#database-token).
-5. Create a [Simple agent starter project](/starter-projects-simple-agent) if you want to follow along with this guide. Otherwise, you can use an existing flow or create a new, blank flow.
+5. Create a [Simple agent starter project](/simple-agent) if you want to follow along with this guide. Otherwise, you can use an existing flow or create a new, blank flow.
 6. Remove the **URL** tool, and then replace it with an [MCP connection component](/mcp-client).
 The flow should look like this:

@@ -1,6 +1,6 @@
 ---
 title: Basic prompting
-slug: /starter-projects-basic-prompting
+slug: /basic-prompting
 ---
 import Icon from "@site/src/components/icon";

@@ -9,7 +9,7 @@ import Icon from "@site/src/components/icon";
 The **Chat memory** component is also known as the **Message history** component.
 :::
-This flow extends the [basic prompting flow](/starter-projects-basic-prompting) with a **Message history** component that stores up to 100 previous chat messages and uses them to provide context for the current conversation.
+This flow extends the [basic prompting flow](/basic-prompting) with a **Message history** component that stores up to 100 previous chat messages and uses them to provide context for the current conversation.
 ## Prerequisites

@@ -1,6 +1,6 @@
 ---
 title: Simple agent
-slug: /starter-projects-simple-agent
+slug: /simple-agent
 ---
 Build a **Simple Agent** flow for an agentic application using the [Tool-calling agent](/agents-tool-calling-agent-component) component.

@@ -1,6 +1,6 @@
 ---
 title: Vector store RAG
-slug: /starter-projects-vector-store-rag
+slug: /vector-store-rag
 ---
 import Icon from "@site/src/components/icon";

@@ -200,9 +200,10 @@ const config = {
       from: ["/starter-projects-document-qa", "/tutorials-document-qa"],
     },
     {
-      to: "/starter-projects-simple-agent",
+      to: "/simple-agent",
       from: [
         "/math-agent",
+        "/starter-projects-simple-agent",
         "/starter-projects-math-agent",
         "/tutorials-math-agent",
       ],
@@ -268,6 +269,14 @@ const config = {
         "/deployment-kubernetes",
       ],
     },
+    {
+      to: "/basic-prompting",
+      from: "/starter-projects-basic-prompting",
+    },
+    {
+      to: "/vector-store-rag",
+      from: "/starter-projects-vector-store-rag",
+    },
     // add more redirects like this
     // {
     //   to: '/docs/anotherpage',

@@ -11,23 +11,17 @@ module.exports = {
   },
   {
     type: "category",
-    label: "Starter projects",
+    label: "Templates",
     items: [
-      'Starter-Projects/starter-projects-basic-prompting',
-      'Starter-Projects/starter-projects-vector-store-rag',
-      'Starter-Projects/starter-projects-simple-agent',
-    ],
-  },
-  {
-    type: "category",
-    label: "Sample flows",
-    items: [
-      'Sample-Flows/blog-writer',
-      'Sample-Flows/document-qa',
-      'Sample-Flows/memory-chatbot',
-      'Sample-Flows/financial-report-parser',
-      'Sample-Flows/sequential-agent',
-      'Sample-Flows/travel-planning-agent',
+      'Templates/basic-prompting',
+      'Templates/simple-agent',
+      'Templates/blog-writer',
+      'Templates/document-qa',
+      'Templates/memory-chatbot',
+      'Templates/vector-store-rag',
+      'Templates/financial-report-parser',
+      'Templates/sequential-agent',
+      'Templates/travel-planning-agent',
     ],
   },
   {