docs: add chat io component examples (#7734)
* add-component-example
* cleanup
* Update docs/docs/Components/components-io.md
* numbering
This commit is contained in:
parent
845f65b9aa
commit
7dad1485ed
2 changed files with 113 additions and 0 deletions
@@ -3,6 +3,8 @@ title: Inputs and outputs
slug: /components-io
---

import Icon from "@site/src/components/icon";

# Input and output components in Langflow

Input and output components define where data enters and exits your flow.
@@ -129,5 +131,116 @@ The output does not appear in the **Playground**.
|------|--------------|------|
|text|Text|The resulting text message.|

## Chat components example flow

1. To use the **Chat Input** and **Chat Output** components in a flow, connect them to components that accept or send the [Message](/concepts-objects#message) type.

   For this example, connect a **Chat Input** component to an **OpenAI** model component's **Input** port, and then connect the **OpenAI** model component's **Message** port to the **Chat Output** component.

2. In the **OpenAI** model component, in the **OpenAI API Key** field, add your **OpenAI API key**.

   The flow looks like this:

   
3. To send a message to your flow, open the **Playground**, and then enter a message.

   The **OpenAI** model component responds. Optionally, in the **OpenAI** model component, enter a **System Message** to control the model's response.

4. In the Langflow UI, click your flow name, and then click **Logs**.

   The **Logs** pane opens. Here, you can inspect your component logs.

   

5. Your first message was sent by the **Chat Input** component to the **OpenAI** model component.

   Click **Outputs** to view the sent message:

   ```text
   "messages": [
     {
       "message": "What's the recommended way to install Docker on Mac M1?",
       "sender": "User",
       "sender_name": "User",
       "session_id": "Session Apr 21, 17:37:04",
       "stream_url": null,
       "component_id": "ChatInput-4WKag",
       "files": [],
       "type": "text"
     }
   ],
   ```
6. Your second message was sent by the **OpenAI** model component to the **Chat Output** component.

   This is the raw text output of the model's response. The **Chat Output** component accepts this text as input and presents it as a formatted message.

   Click **Outputs** to view the sent message:

   ```text
   "outputs":
     "text_output":
       "message": "To install Docker on a Mac with an M1 chip, you should use Docker Desktop for Mac, which is optimized for Apple Silicon. Here’s a step-by-step guide to installing Docker on your M1 Mac:\n\n1.
   ...
       "type": "text"
   ```

:::tip
Optionally, to view the outputs of each component in the flow, click <Icon name="TextSearch" aria-label="Inspect icon" />.
:::
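If you are working with these logs outside the UI, the same structure can be handled with a few lines of code. The sketch below parses a payload shaped like the `messages` excerpt in step 5; the field names come from the log, but the wrapping braces are an assumption:

```python
import json

# Log payload shaped like the Chat Input "Outputs" excerpt above.
# The wrapping braces are an assumption; the field names match the log.
raw = """
{
  "messages": [
    {
      "message": "What's the recommended way to install Docker on Mac M1?",
      "sender": "User",
      "sender_name": "User",
      "session_id": "Session Apr 21, 17:37:04",
      "stream_url": null,
      "component_id": "ChatInput-4WKag",
      "files": [],
      "type": "text"
    }
  ]
}
"""

payload = json.loads(raw)
for msg in payload["messages"]:
    # Each entry records who sent the message and which component produced it.
    print(f'{msg["sender_name"]} via {msg["component_id"]}: {msg["message"]}')
```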

### Send chat messages with the API

The **Chat Input** component is often the entry point for passing messages to the Langflow API.
To send the same example messages programmatically to your Langflow server, do the following:

1. To get your Langflow endpoint, click **Publish**, and then click **API access**.
2. Copy the command from the **cURL** tab, and then paste it in your terminal.

   It looks similar to this:

   ```text
   curl --request POST \
     --url 'http://127.0.0.1:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
     --header 'Content-Type: application/json' \
     --data '{
     "input_value": "What'\''s the recommended way to install Docker on Mac M1?",
     "output_type": "chat",
     "input_type": "chat"
   }'
   ```

3. Modify `input_value` so it contains the question, `What's the recommended way to install Docker on Mac M1?`. Because the `--data` JSON is wrapped in single quotes, escape the apostrophe as `'\''`.

   Note the `output_type` and `input_type` parameters that are passed with the message. The `chat` type provides additional configuration options, and the messages appear in the **Playground**. The `text` type returns only text strings, and does not appear in the **Playground**.

4. Add a custom `session_id` parameter to the request body:

   ```text
   curl --request POST \
     --url 'http://127.0.0.1:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
     --header 'Content-Type: application/json' \
     --data '{
     "input_value": "Whats the recommended way to install Docker on Mac M1",
     "session_id": "docker-question-on-m1",
     "output_type": "chat",
     "input_type": "chat"
   }'
   ```

   The custom `session_id` value starts a new chat session between your client and the Langflow server, and can be useful for keeping conversations and AI context separate.

5. Send the POST request.

   Your request is answered.

6. Navigate to the **Playground**.

   A new chat session called `docker-question-on-m1` appears, identified by your custom `session_id`.

7. To modify additional parameters with **Tweaks** for your **Chat Input** and **Chat Output** components, click **Publish**, and then click **API access**.
8. Click **Tweaks** to modify parameters in the component's `data` object.

   For example, disabling message storage in the **Chat Input** component adds a tweak to your command:

   ```text
   curl --request POST \
     --url 'http://127.0.0.1:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
     --header 'Content-Type: application/json' \
     --data '{
     "input_value": "Text to input to the flow",
     "output_type": "chat",
     "input_type": "chat",
     "tweaks": {
       "ChatInput-4WKag": {
         "should_store_message": false
       }
     }
   }'
   ```

   To confirm your command is using the tweak, navigate to the **Logs** pane and view the request from the **Chat Input** component. The value for `should_store_message` is `false`.

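The curl commands above can also be issued from a script. The sketch below uses only the Python standard library to build the same request body, including the optional `session_id` and `tweaks`; `build_run_request` is a hypothetical helper, and the URL and flow ID are the placeholder values from the examples above:

```python
import json
from urllib import request

BASE_URL = "http://127.0.0.1:7860"  # default local Langflow address
FLOW_ID = "51eed711-4530-4fdc-9bce-5db4351cc73a"  # replace with your flow ID

def build_run_request(message, session_id=None, tweaks=None):
    """Build a POST request for the /api/v1/run/{flow_id} endpoint."""
    payload = {
        "input_value": message,
        "output_type": "chat",
        "input_type": "chat",
    }
    if session_id is not None:
        payload["session_id"] = session_id
    if tweaks is not None:
        payload["tweaks"] = tweaks
    return request.Request(
        f"{BASE_URL}/api/v1/run/{FLOW_ID}?stream=false",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_run_request(
    "Whats the recommended way to install Docker on Mac M1",
    session_id="docker-question-on-m1",
    tweaks={"ChatInput-4WKag": {"should_store_message": False}},
)
# Sending the request requires a running Langflow server:
# response = request.urlopen(req)
print(json.loads(req.data)["session_id"])
```

The component ID `ChatInput-4WKag` and the flow ID are copied from the examples above; your own values will differ.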
BIN docs/static/img/component-chat-io.png vendored Normal file
Binary file not shown.
After Width: | Height: | Size: 956 KiB |