docs: Revise some content related to flow triggers, including the webhook trigger process and some /run endpoint examples. (#8880)
* typos and cosmetic edits for some api content
* webhook trigger rewrite
* build errors and coderabbit
* Update docs/docs/API-Reference/api-reference-api-examples.md
* nitpicks
* note for 1.5
* peer review

parent 746a3b6264
commit 343bdd0632

6 changed files with 178 additions and 91 deletions
@@ -92,11 +92,9 @@ curl -X GET \
 | Header | Info | Example |
 |--------|------|---------|
 | Content-Type | Required. Specifies the JSON format. | "application/json" |
-| accept | Required. Specifies the response format. | "application/json" |
+| accept | Optional. Specifies the response format. | "application/json" |
 | x-api-key | Optional. Required only if authentication is enabled. | "sk-..." |
 
 The `/build/$FLOW_ID/flow` endpoint accepts the following parameters in its request body:
 
 ## Build parameters
 
 | Parameter | Type | Description |

@@ -84,7 +84,6 @@ The API returns the image file path in the format `"file_path":"<YOUR-FLOW-ID>/<
 }
 ```
 
-<!-- TODO: What link goes here? -->
 2. Post the image file to the **Chat Input** component of a **Basic prompting** flow.
 Pass the file path value as an input in the **Tweaks** section of the curl call to Langflow.
 Component `id` values can be found in [Langflow JSON files](/concepts-flows#langflow-json-file-contents).

@@ -6,26 +6,21 @@ slug: /api-flows-run
 import Tabs from '@theme/Tabs';
 import TabItem from '@theme/TabItem';
 
-Use the `/run` amd `/webhook` endpoints to run flows.
+Use the `/run` and `/webhook` endpoints to run flows.
 
 To create, read, update, and delete flows, see [Flow management endpoints](/api-flows).
 
 ## Run flow
 
 Execute a specified flow by ID or name.
-This example runs a [Basic Prompting](/basic-prompting) flow with a given flow ID and passes a JSON object as the input value.
-Flow IDs can be found on the [Publish pane](/concepts-publish) or in a flow's URL.
-
-The parameters are passed in the request body. In this example, the values are the default values.
+The flow is executed as a batch, but LLM responses can be streamed.
+Flow IDs can be found on the code snippets on the [**API access** pane](/concepts-publish#api-access) or in a flow's URL.
 
 <Tabs>
 <TabItem value="curl" label="curl" default>
+The following example runs a [Basic Prompting](/basic-prompting) flow with flow parameters passed in the request body.
+This flow requires a chat input string (`input_value`), and uses default values for all other parameters.
 
 ```bash
 curl -X POST \
-  "$LANGFLOW_URL/api/v1/run/$FLOW_ID" \
+  "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID" \
   -H "Content-Type: application/json" \
   -d '{
     "input_value": "Tell me about something interesting!",
@@ -37,10 +32,14 @@ curl -X POST \
 }'
 ```
 
 </TabItem>
 <TabItem value="result" label="Result">
+The response from `/v1/run/$FLOW_ID` includes metadata, inputs, and outputs for the run.
 
-```text
+<details>
+<summary>Result</summary>
+
+The following example illustrates a response from a Basic Prompting flow:
+
+```json
 {
   "session_id": "chat-123",
   "outputs": [{
@@ -71,20 +70,20 @@ curl -X POST \
 }]
 }
 ```
+</details>
 
 </TabItem>
 </Tabs>
+If you are parsing the response in an application, you most likely need to extract the relevant content from the response, rather than pass the entire response back to the user.
+For an example of a script that extracts data from a Langflow API response, see the [Quickstart](/get-started-quickstart).
 
 ### Stream LLM token responses
 
-To stream LLM token responses, append the `?stream=true` query parameter to the request. LLM chat responses are streamed back as `token` events until the `end` event closes the connection.
+With `/v1/run/$FLOW_ID`, the flow is executed as a batch with optional LLM token response streaming.
 
+<Tabs>
+<TabItem value="curl" label="curl" default>
+To stream LLM token responses, append the `?stream=true` query parameter to the request:
 
 ```bash
 curl -X POST \
-  "$LANGFLOW_URL/api/v1/run/$FLOW_ID?stream=true" \
+  "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID?stream=true" \
   -H "accept: application/json" \
   -H "Content-Type: application/json" \
   -d '{
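The docs text added in this hunk points readers at extracting only the relevant content from the `/run` response. A minimal sketch of that extraction, assuming the nested response shape shown in the Result tab above (the sample payload here is hypothetical):

```shell
# Hypothetical /run response, shaped like the Result tab example above
RESPONSE='{"session_id":"chat-123","outputs":[{"outputs":[{"results":{"message":{"text":"Hello there!"}}}]}]}'

# Pull out just the chat message text; python3 is used for JSON parsing
# so the sketch has no dependency on jq
MESSAGE=$(printf '%s' "$RESPONSE" | python3 -c '
import json, sys
data = json.load(sys.stdin)
print(data["outputs"][0]["outputs"][0]["results"]["message"]["text"])
')
echo "$MESSAGE"
```

In a real application you would capture `RESPONSE` from the curl call itself, as the linked Quickstart demonstrates.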
@@ -93,8 +92,12 @@ curl -X POST \
 }'
 ```
 
 </TabItem>
+<TabItem value="result" label="Result">
+LLM chat responses are streamed back as `token` events, culminating in a final `end` event that closes the connection.
+
+<details>
+<summary>Result</summary>
+
+The following example is truncated to illustrate a series of `token` events as well as the final `end` event that closes the LLM's token streaming response:
 
 ```text
 {"event": "add_message", "data": {"timestamp": "2025-03-03T17:20:18", "sender": "User", "sender_name": "User", "session_id": "chat-123", "text": "Tell me about something interesting!", "files": [], "error": false, "edit": false, "properties": {"text_color": "", "background_color": "", "edited": false, "source": {"id": null, "display_name": null, "source": null}, "icon": "", "allow_markdown": false, "positive_feedback": null, "state": "complete", "targets": []}, "category": "message", "content_blocks": [], "id": "0103a21b-ebf7-4c02-9d72-017fb297f812", "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201"}}
@@ -117,22 +120,20 @@ curl -X POST \
 
 {"event": "end", "data": {"result": {"session_id": "chat-123", "message": "Sure! Have you ever heard of the phenomenon known as \"bioluminescence\"?..."}}}
 ```
-
-This result is abbreviated, but illustrates where the `end` event completes the LLM's token streaming response.
+</details>
 
+</TabItem>
+</Tabs>
 
 ### Run endpoint headers
 
 | Header | Info | Example |
 |--------|------|---------|
 | Content-Type | Required. Specifies the JSON format. | "application/json" |
-| accept | Required. Specifies the response format. | "application/json" |
+| accept | Optional. Specifies the response format. | "application/json" |
 | x-api-key | Optional. Required only if authentication is enabled. | "sk-..." |
 
 ### Run endpoint parameters
 
 <!-- TODO: Can there be other parameters depending on the components in the flow? -->
 
 | Parameter | Type | Info |
 |-----------|------|------|
 | flow_id | UUID/string | Required. Part of URL: `/run/$FLOW_ID` |
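The streaming Result tab above shows one JSON event per line. A minimal consumer sketch, assuming that line-per-event shape (the sample events below are hypothetical and simplified):

```shell
# Hypothetical simplified stream: one JSON event per line, as in the
# truncated Result tab above
STREAM='{"event": "token", "data": {"chunk": "Bioluminescence "}}
{"event": "token", "data": {"chunk": "is fascinating."}}
{"event": "end", "data": {"result": {"message": "Bioluminescence is fascinating."}}}'

# Accumulate token chunks until the end event closes the stream
ANSWER=$(printf '%s\n' "$STREAM" | python3 -c '
import json, sys
chunks = []
for line in sys.stdin:
    event = json.loads(line)
    if event["event"] == "token":
        chunks.append(event["data"]["chunk"])
    elif event["event"] == "end":
        break
print("".join(chunks))
')
echo "$ANSWER"
```

Against a live server you would pipe the `curl ... ?stream=true` output into the same loop instead of the canned `STREAM` variable.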
@@ -148,7 +149,7 @@ This result is abbreviated, but illustrates where the `end` event completes the
 
 ```bash
 curl -X POST \
-  "http://$LANGFLOW_URL/api/v1/run/$FLOW_ID?stream=true" \
+  "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID?stream=true" \
   -H "Content-Type: application/json" \
   -H "accept: application/json" \
   -H "x-api-key: sk-..." \
@@ -176,7 +177,7 @@ After you add a **Webhook** component to a flow, open the [**API access** pane](
 
 ```bash
 curl -X POST \
-  "$LANGFLOW_URL/api/v1/webhook/$FLOW_ID" \
+  "$LANGFLOW_SERVER_URL/api/v1/webhook/$FLOW_ID" \
   -H "Content-Type: application/json" \
   -d '{"data": "example-data"}'
 ```

@@ -16,7 +16,7 @@ You can use the Langflow API for programmatic interactions with Langflow, such a
 
 To view and test all available endpoints, you can access the Langflow API's OpenAPI specification at your Langflow deployment's `/docs` endpoint, such as `http://localhost:7860/docs`.
 
-:::tip
+:::tip Try it
 For an example of the Langflow API in a script, see the [Langflow quickstart](/get-started-quickstart).
 
 The quickstart demonstrates how to get automatically generated code snippets for your flows, use a script to run a flow, and extract data from the Langflow API response.
@@ -24,10 +24,37 @@ The quickstart demonstrates how to get automatically generated code snippets for
 
 ## Form Langflow API requests
 
-While individual parameters vary by endpoint, all Langflow API requests share some commonalities.
+While individual options vary by endpoint, all Langflow API requests share some commonalities, like a URL, method, parameters, and authentication.
+
+As an example of a Langflow API request, the following curl command calls the `/v1/run` endpoint, and it passes a runtime override (`tweaks`) to the flow's Chat Output component:
+
+```bash
+curl --request POST \
+  --url "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID?stream=false" \
+  --header "Content-Type: application/json" \
+  --header "x-api-key: $LANGFLOW_API_KEY" \
+  --data '{
+    "input_value": "hello world!",
+    "output_type": "chat",
+    "input_type": "chat",
+    "tweaks": {
+      "ChatOutput-6zcZt": {
+        "should_store_message": true
+      }
+    }
+  }'
+```
 
 ### Base URL
 
+<!-- For 1.5
+By default, local deployments serve the Langflow API at `http://localhost:7860/api`.
+
+Remotely hosted Langflow deployments are available at the domain set by the hosting service, such as `http://IP_OR_DNS/api` or `http://IP_OR_DNS:LANGFLOW_PORT/api`.
+
+You can configure the Langflow port number in the `LANGFLOW_PORT` [environment variable](/environment-variables).
+-->
+
 Local deployments serve the Langflow API at `http://localhost:LANGFLOW_PORT/api`.
 The default port is 7868 or 7860:
@@ -55,15 +82,18 @@ For more information, see [API keys](/configuration-api-keys).
 
 Because authentication isn't always required, Langflow API examples in the Langflow documentation often omit authentication.
 
+As with any API, follow industry best practices for storing and referencing sensitive credentials.
+For example, you can [set environment variables](#set-environment-variables) for your API keys, and then reference those environment variables in your API requests.
+
 ### Methods, paths, and parameters
 
-Langflow API requests use a variety of methods, paths, path parameters, query parameters, and body parameters.
+Langflow API requests use various methods, paths, path parameters, query parameters, and body parameters.
 The specific requirements and options depend on the endpoint that you want to call.
 
 For example, to create a flow, you pass a JSON-formatted flow definition to `POST /v1/flows`.
 Then, to run your flow, you call `POST /v1/run/$FLOW_ID` with optional run parameters in the request body.
 
-### Versions
+### API versions
 
 The Langflow API serves `/v1` and `/v2` endpoints.
@@ -73,24 +103,38 @@ If a request fails or has an unexpected result, make sure your endpoint path has
 
 ## Set environment variables
 
-As a best practice with any API, store commonly used values in environment variables to facilitate reuse, simplify token rotation, and securely reference sensitive values.
-You can use any method you prefer to set environment variables, such as `export`, `.env`, `zshrc`, or `.curlrc`.
-Additionally, be sure to follow industry best practices when storing credentials and other sensitive values.
+You can store commonly used values in environment variables to facilitate reuse, simplify token rotation, and securely reference sensitive values.
 
-You might find it helpful to set environment variables for values like your Langflow server URL, Langflow API keys, flow IDs, and project IDs.
+You can use any method you prefer to set environment variables, such as `export`, `.env`, `zshrc`, or `.curlrc`.
+Then, reference those environment variables in your API requests.
 For example:
 
 ```bash
-export LANGFLOW_URL="http://localhost:7860"
+# Set environment variables
+export LANGFLOW_API_KEY="sk..."
+export LANGFLOW_SERVER_URL="http://localhost:7860"
 export FLOW_ID="359cd752-07ea-46f2-9d3b-a4407ef618da"
 export PROJECT_ID="1415de42-8f01-4f36-bf34-539f23e47466"
-export API_KEY="sk-..."
+
+# Use environment variables in API requests
+curl --request POST \
+  --url "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID?stream=false" \
+  --header "Content-Type: application/json" \
+  --header "x-api-key: $LANGFLOW_API_KEY" \
+  --data '{
+    "input_value": "hello world!",
+    "output_type": "chat",
+    "input_type": "chat",
+    "tweaks": {
+      "ChatOutput-6zcZt": {
+        "should_store_message": true
+      }
+    }
+  }'
 ```
 
-:::tip
-- You can find flow IDs on the [Publish pane](/concepts-publish), in a flow's URL, and with [`GET /flows`](/api-flows#read-flows).
-- You can retrieve project IDs with [`GET /projects`](/api-projects#read-projects).
-:::
+Commonly used values in Langflow API requests include your [Langflow server URL](#base-url), [Langflow API keys](/configuration-api-keys), flow IDs, and [project IDs](/api-projects#read-projects).
+
+You can retrieve flow IDs from the [**API access** pane](/concepts-publish#api-access), in a flow's URL, and with [`GET /flows`](/api-flows#read-flows).
 
 ## Try some Langflow API requests
@@ -104,7 +148,7 @@ Returns the current Langflow API version:
 
 ```bash
 curl -X GET \
-  "$LANGFLOW_URL/api/v1/version" \
+  "$LANGFLOW_SERVER_URL/api/v1/version" \
   -H "accept: application/json"
 ```

@@ -125,7 +169,7 @@ Returns configuration details for your Langflow deployment:
 
 ```bash
 curl -X GET \
-  "$LANGFLOW_URL/api/v1/config" \
+  "$LANGFLOW_SERVER_URL/api/v1/config" \
   -H "accept: application/json"
 ```

@@ -151,7 +195,7 @@ Returns a dictionary of all Langflow components:
 
 ```bash
 curl -X GET \
-  "$LANGFLOW_URL/api/v1/all" \
+  "$LANGFLOW_SERVER_URL/api/v1/all" \
   -H "accept: application/json"
 ```

@@ -262,28 +262,35 @@ Peruvian writer and Nobel Prize in Literature laureate Mario Vargas Llosa (pictu
 
 This component defines a webhook trigger that runs a flow when it receives an HTTP POST request.
 
-If the input is not valid JSON, the component wraps it in a `payload` object so that it can be processed and still trigger the flow. The component does not require an API key.
+If the input is not valid JSON, the component wraps it in a `payload` object so that it can be processed and still trigger the flow.
 
-When a **Webhook** component is added to the workspace, a new **Webhook cURL** tab becomes available in the **API** pane that contains an HTTP POST request for triggering the webhook component. For example:
+When you add a **Webhook** component to a flow, the flow's [**API access** pane](/concepts-publish#api-access) exposes an additional **Webhook cURL** tab that contains a `POST /v1/webhook/$FLOW_ID` code snippet.
+You can use this request to send data to the **Webhook** component and trigger the flow.
+For example:
 
 ```bash
 curl -X POST \
-  "http://localhost:7860/api/v1/webhook/**YOUR_FLOW_ID**" \
+  "$LANGFLOW_SERVER_URL/api/v1/webhook/$FLOW_ID" \
   -H 'Content-Type: application/json' \
   -d '{"any": "data"}'
 ```
 
-To test the webhook component:
+The **Webhook** component is often paired with a [**Parser** component](/components-processing#parser) to extract relevant data from the raw payload.
+For more information, see [Trigger flows with webhooks](/webhook).
 
-1. Add a **Webhook** component to the flow.
-2. Connect the **Webhook** component's **Data** output to the **Data** input of a [Parser](/components-processing#parser) component.
-3. Connect the **Parser** component's **Parsed Text** output to the **Text** input of a [Chat Output](/components-io#chat-output) component.
-4. In the **Parser** component, under **Mode**, select **Stringify**.
-   This mode passes the webhook's data as a string for the **Chat Output** component to print.
-5. To send a POST request, copy the code from the **Webhook cURL** tab in the **API** pane and paste it into a terminal.
-6. Send the POST request.
-7. Open the **Playground**.
-   Your JSON data is posted to the **Chat Output** component, which indicates that the webhook component is correctly triggering the flow.
+To troubleshoot a flow with a **Webhook** component and verify that the component is receiving data, you can create a small flow that outputs only the parsed payload:
 
+1. Create a flow with **Webhook**, **Parser**, and **Chat Output** components.
+2. Connect the Webhook component's **Data** output to the Parser component's **Data** input.
+3. Connect the Parser component's **Parsed Text** output to the Chat Output component's **Text** input.
+4. Edit the **Parser** component to set **Mode** to **Stringify**.
+
+   This mode passes the data received by the Webhook component as a string that is printed by the **Chat Output** component.
+
+5. Click **Share**, select **API access**, and then copy the **Webhook cURL** code snippet.
+6. Optional: Edit the `data` in the code snippet if you want to pass a different payload.
+7. Send the POST request to trigger the flow.
+8. Click **Playground** to verify that the **Chat Output** component printed the JSON data from your POST request.
 
 <details>
 <summary>Parameters</summary>

@@ -5,39 +5,70 @@ slug: /webhook
 
 import Icon from "@site/src/components/icon";
 
-Add a **Webhook** component to your flow to trigger it with external requests.
+You can use the **Webhook** component to start a flow run in response to an external event.
 
-To connect the **Webhook** to a **Parser** component to view and parse your data payload, do the following:
+With the **Webhook** component, a flow can receive data directly from external sources. Then, the flow can parse the data and pass it to other components in the flow to initiate other actions, such as calling APIs, writing to databases, and chatting with LLMs.
+
+The **Webhook** component provides a versatile entrypoint that can make your flows more event-driven and integrated with your entire stack of applications and services.
+For example:
+
+* Use an LLM to analyze the sentiment and content of customer feedback or survey responses.
+* Receive notifications from a monitoring system, and then trigger automated responses based on alert type and severity.
+* Integrate with e-commerce platforms to process orders and update inventory.
+
+## Configure the Webhook component
+
+To use the **Webhook** component in a flow, do the following:
 
-1. Add a **Webhook** component to your flow.
-2. Add a [Parser](/components-processing#parser) component to your flow.
-3. Connect the **Webhook** component's **Data** output to the **Parser** component's **Data** input.
-4. In the **Template** field of the **Parser** component, enter a template for parsing the **Webhook** component's input into structured text.
-
-   :::important
-   The component may fail to build because it needs data from the **Webhook** first.
-   If you experience issues, change the **Mode** on the **Parser** component to **Stringify**, so the component outputs a single string.
-   :::
-
-   Create variables for values in the `template` the same way you would in a [Prompt](/components-prompts) component.
-   For example, to parse `id`, `name`, and `email` strings:
+1. In Langflow, open the flow where you want to use the **Webhook** component.
+
+2. Add a [**Webhook** component](/components-data#webhook) and a [**Parser** component](/components-processing#parser) to your flow.
+
+   The **Parser** component extracts relevant data from the raw payload received by the **Webhook** component.
+
+3. Connect the Webhook component's **Data** output to the Parser component's **Data** input.
+
+4. In the Parser component's **Template** field, enter a template to parse the raw payload into structured text.
+
+   In the template, use variables for payload keys in the same way you would define variables in a [**Prompt** component](/components-prompts).
+
+   For example, assume that you expect your **Webhook** component to receive the following JSON data:
+
+   ```json
+   {
+     "id": "",
+     "name": "",
+     "email": ""
+   }
+   ```
+
+   Then, you can use curly braces to reference the JSON keys anywhere in your parser template:
 
    ```text
    ID: {id} - Name: {name} - Email: {email}
    ```
 
-5. In the **Endpoint** field of the **Webhook** component, copy the API endpoint for your external requests.
-6. Optionally, to retrieve a complete example request from the component, click **Controls**, and then copy the command from the **cURL** value field.
-
-   :::important
-   The default curl command includes a field for `x-api-key`. This field is **optional** and can be deleted from the command if you aren't using authentication.
-   :::
-
-7. Send a POST request with any data to trigger your flow.
-   This example uses `id`, `name`, and `email` strings.
-   Replace **FLOW_ID** with your flow's ID, which can be found on the [Publish pane](/concepts-publish) or in the flow's URL.
+5. Connect the Parser component's **Parsed Text** output to the next logical component in your flow, such as a Chat Input component.
+
+   If you want to test only the Webhook and Parser components, you can connect the **Parsed Text** output directly to a Chat Output component's **Text** input. Then, you can see the parsed data in the **Playground** after you run the flow.
+
+6. From the Webhook component's **Endpoint** field, copy the API endpoint that you will use to send data to the Webhook component and trigger the flow.
+
+   Alternatively, to get a complete `POST /v1/webhook/$FLOW_ID` code snippet, open the flow's [**API access** pane](/concepts-publish#api-access), and then click the **Webhook cURL** tab.
+   You can also modify the default curl command in the Webhook component's **cURL** field.
+   If this field isn't visible by default, click the Webhook component, and then click **Controls** in the component's header menu.
+
+7. Send a POST request with `data` to the flow's `webhook` endpoint to trigger the flow.
+
+   The following example sends a payload containing `id`, `name`, and `email` strings:
 
    ```bash
-   curl -X POST "http://localhost:7860/api/v1/webhook/YOUR_FLOW_ID" \
+   curl -X POST "$LANGFLOW_SERVER_URL/api/v1/webhook/$FLOW_ID" \
      -H 'Content-Type: application/json' \
      -d '{"id": "12345", "name": "alex", "email": "alex@email.com"}'
    ```
 
-This response indicates Langflow received your request:
+   A successful response indicates that Langflow started the flow:
 
    ```json
    {
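The POST example above hard-codes its JSON payload. A minimal sketch for building that payload from shell variables instead, using python3's `json` module so values are escaped correctly (the variable names are illustrative):

```shell
# Illustrative values; in practice these come from your external system
ID="12345"
NAME="alex"
EMAIL="alex@email.com"

# Build the JSON body with python3 so special characters are escaped safely
PAYLOAD=$(python3 -c '
import json, sys
keys = ["id", "name", "email"]
print(json.dumps(dict(zip(keys, sys.argv[1:]))))
' "$ID" "$NAME" "$EMAIL")
echo "$PAYLOAD"
```

You could then pass `"$PAYLOAD"` to `curl -d` in place of the literal string.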
@@ -46,18 +77,25 @@ Replace **FLOW_ID** with your flow's ID, which can be found on the [Publish pane
 }
 ```
 
-1. To view the data received from your request, in the **Parser** component, click <Icon name="TextSearch" aria-hidden="true"/> **Inspect output**.
-
-   You should receive a string of parsed text, like `ID: 12345 - Name: alex - Email: alex@email.com`.
-
-You have successfully parsed data out of an external JSON payload.
+The output for the entire flow isn't returned by the `webhook` endpoint.
 
-By passing the event trigger data payload directly into a flow, you can also parse the event data with a chain of components, and use its data to trigger other events.
+8. To view the flow's most recent parsed payload, click the **Parser** component, and then click <Icon name="TextSearch" aria-hidden="true"/> **Inspect output**.
+   For the preceding example, the parsed payload would be a string like `ID: 12345 - Name: alex - Email: alex@email.com`.
+
+## Troubleshoot Parser component build failure
+
+The **Parser** component can fail to build if it doesn't receive data from the **Webhook** component or if there is a problem with the incoming data.
+
+If this occurs, try changing the Parser component's **Mode** to **Stringify** so that the component outputs the parsed payload as a single string.
+
+Then, you can examine the string output and troubleshoot your parsing template, or work with the parsed data in string form.
 
 ## Trigger flows with Composio webhooks
 
-Now that you've triggered the webhook component manually, follow along with this step-by-step video guide for triggering flows with payloads from external applications: [How to Use Webhooks in Langflow](https://www.youtube.com/watch?v=IC1CAtzFRE0).
+Typically, you won't manually trigger the webhook component.
+To learn about triggering flows with payloads from external applications, see the video tutorial [How to Use Webhooks in Langflow](https://www.youtube.com/watch?v=IC1CAtzFRE0).
+
+## See also
+
+- [Webhook component](/components-data#webhook)
+- [Flow trigger endpoints](/api-flows-run)