diff --git a/docs/docs/Components/components-embedding-models.mdx b/docs/docs/Components/components-embedding-models.mdx index 49f49d435..f3c9525c6 100644 --- a/docs/docs/Components/components-embedding-models.mdx +++ b/docs/docs/Components/components-embedding-models.mdx @@ -8,9 +8,7 @@ import Icon from "@site/src/components/icon"; **Embedding Model** components in Langflow generate text embeddings using a specified Large Language Model (LLM). Langflow includes an **Embedding Model** core component that has built-in support for some LLMs. -Alternatively, you can use any [additional **Embedding Model** component](#additional-embedding-model-components) in place of the core **Embedding Model** component. - -The built-in LLMs are appropriate for most text-based embedding model use cases in Langflow. +Alternatively, you can use [additional embedding models](#additional-embedding-model-components) in place of the core **Embedding Model** component. ## Use Embedding Model components in a flow @@ -21,22 +19,16 @@ This flow loads a text file, splits the text into chunks, generates embeddings f ![A semantic search flow that uses Embedding Model, File, Split Text, Chroma DB, Chat Input, and Chat Output components](/img/component-embedding-models-add-chat.png) -:::tip -This example uses the **Embedding Model** core component. - -To use another model, you can replace the **Embedding Model** core component with any [additional **Embedding Model** component](#additional-embedding-model-components) in these steps. -However, your component might have different parameters than the **Embedding Model** core component. -::: - 1. Create a flow, add a **File** component, and then select a file containing text data, such as a PDF, that you can use to test the flow. 2. Add an **Embedding Model** component, and then provide a valid OpenAI API key. +You can enter component API keys directly or use Langflow global variables to reference your API keys. 
- By default, the **Embedding Model** component uses an OpenAI model. - If you want to use a different model, edit the **Model Name**, and **API Key** fields accordingly. - Or, see [Additional Embedding Model components](#additional-embedding-model-components) for other components that you can use in place of the **Embedding Model** core component. + :::tip + If your preferred embedding model provider or model isn't supported by the **Embedding Model** core component, you can use [additional embedding models](#additional-embedding-model-components) in place of the core component. - You can enter component API keys directly or use Langflow global variables to reference your API keys. + Search the **Components** menu for your preferred provider to find additional embedding models, such as the [**Hugging Face Embeddings Inference** component](/bundles-huggingface#hugging-face-embeddings-inference). + ::: 3. Add a [**Split Text** component](/components-processing#split-text) to your flow. This component splits text input into smaller chunks to be processed into embeddings. @@ -76,9 +68,9 @@ You can toggle parameters through the **Controls**, enable the **System Message** parameter, and then click **Close**. @@ -118,24 +119,27 @@ This is a specific data type that is only required by certain components, such a With this configuration, the **Language Model** component is meant to support an action completed by another component, rather than producing a text response for a standard chat-based interaction. For an example, the **Smart Function** component uses an LLM to create a function from natural language input. -## Additional Language Model components +## Additional language models -If your provider or model isn't supported by the **Language Model** core component, additional single-provider **Language Model** components are available in the [**Bundles**](/components-bundle-components) section of the **Components** menu. 
+If your provider or model isn't supported by the **Language Model** core component, additional provider-specific models are available in the [**Bundles**](/components-bundle-components) section of the **Components** menu. -You can use bundled components directly in your flows or you can connect them to other components that accept a [`LanguageModel`](/data-types#languagemodel) input, such as the **Language Model** and **Agent** components. +You can use these provider-specific components directly in your flows in the same place that you would use the **Language Model** core component. +Or, you can connect them to other components that accept a [`LanguageModel`](/data-types#languagemodel) input, such as the **Smart Function** and **Agent** components. -For example, to connect bundled components to the **Language Model** core component, do the following: +For example, to connect a provider-specific component to the **Agent** component, do the following: -1. In the **Language Model** component, set **Model Provider** to **Custom**. +1. In the **Components** menu, search for your preferred model provider, and then add the provider's LLM component to your flow. +The component's name might not include `model`. +For example, Azure OpenAI LLMs are in the [**Azure OpenAI** component](/bundles-azure#azure-openai). - The field name changes to **Language Model** and the input port changes to a `LanguageModel` port. +2. Configure the LLM component as needed to connect to your preferred model. -2. Add a compatible bundled component to your flow, such as the [**Vertex AI** component for text generation](/bundles-vertexai). - -3. Change the bundled component's output type to `LanguageModel`. -To do this, click **Model Response** near the component's output port, and then select **Language Model**. +3. Change the LLM component's output type from **Model Response** to **Language Model**. +The output port changes to a `LanguageModel` port.
For more information, see [Language Model output types](#language-model-output-types). -4. Connect the bundled component's output to the **Language Model** component's `LanguageModel` input port. +4. Add an **Agent** component to the flow, and then set **Model Provider** to **Custom**. +The **Model Provider** field changes to a **Language Model** field with a `LanguageModel` port. - The bundled component now provides the LLM configuration for the component that it is connected to, and you can continue building your flow as needed. \ No newline at end of file +5. Connect the LLM component's output to the **Agent** component's **Language Model** input. +The **Agent** component now inherits the LLM settings from the connected LLM component instead of using any of the built-in models. \ No newline at end of file diff --git a/docs/docs/Support/release-notes.mdx b/docs/docs/Support/release-notes.mdx index a23ffc08a..8faed6160 100644 --- a/docs/docs/Support/release-notes.mdx +++ b/docs/docs/Support/release-notes.mdx @@ -61,7 +61,7 @@ For all changes, see the [Changelog](https://github.com/langflow-ai/langflow/releases). The [**Language Model** component](/components-models) and [**Embedding Model** component](/components-embedding-models) are now core components for your LLM and embeddings flows. They support multiple models and model providers, and allow you to experiment with different models without swapping out single-provider components. Find them in the **Components** menu in the **Models** category. - The single-provider components are still available for your flows in the **Components** menu in the [**Bundles**](/components-bundle-components) section, and you can connect them to the **Language Model** and **Embedding Model** components with the **Custom** provider option.
+ The single-provider components are still available for your flows in the **Components** menu in the [**Bundles**](/components-bundle-components) section, and you can use them to replace the **Language Model** and **Embedding Model** core components, or connect them to the **Agent** component with the **Custom** provider option. - MCP server one-click installation
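Reviewer note (not part of the patch): the embedding docs above describe a Split Text → Embedding Model → vector store pipeline. The sketch below shows, outside Langflow, what that pipeline computes. The `embed` function here is a toy bag-of-words stand-in for a real embedding model (in the flow, the **Embedding Model** component would call a provider such as OpenAI), and the linear scan stands in for the Chroma DB lookup.

```python
# Conceptual sketch of a Split Text -> Embedding Model -> vector store flow.
# embed() is a toy stand-in; a real flow gets dense vectors from an LLM provider.
import math
from collections import Counter

def split_text(text: str, chunk_size: int = 40) -> list[str]:
    """Split text into chunks of roughly chunk_size characters on word boundaries."""
    chunks, current = [], ""
    for word in text.split():
        if current and len(current) + len(word) + 1 > chunk_size:
            chunks.append(current)
            current = word
        else:
            current = f"{current} {word}".strip()
    if current:
        chunks.append(current)
    return chunks

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words token counts (stand-in for a real model).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query: str, chunks: list[str]) -> str:
    # Vector-store lookup: return the chunk whose embedding is closest to the query's.
    return max(chunks, key=lambda c: cosine(embed(query), embed(c)))

chunks = split_text(
    "Langflow generates embeddings for text chunks. "
    "A vector store retrieves the most similar chunk at query time."
)
print(search("similar chunk retrieval", chunks))  # -> most similar chunk at query time.
```

The same shape applies at query time in the documented flow: the **Chat Input** text is embedded with the same model as the stored chunks, and the vector store returns the nearest neighbors for the prompt.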