From 93869c5c8f747d14564f7ecc949ae280b3496319 Mon Sep 17 00:00:00 2001
From: Mendon Kissling <59585235+mendonk@users.noreply.github.com>
Date: Fri, 26 Apr 2024 09:13:20 -0400
Subject: [PATCH] paste

 docs/docs/guides/basic-prompting.mdx | 79 deletions(-)
 1 file changed, 79 deletions(-)

import ThemedImage from "@theme/ThemedImage";
import useBaseUrl from "@docusaurus/useBaseUrl";
import ZoomableImage from "/src/theme/ZoomableImage.js";
import ReactPlayer from "react-player";

# Basic prompting

Prompts serve as the inputs to a large language model (LLM), acting as the interface between human instructions and computational tasks.

By submitting natural language requests in a prompt to an LLM, you can obtain answers, generate text, and solve problems.

This article demonstrates how to use Langflow's prompt tools to issue basic prompts to an LLM, and how various prompting strategies can affect your outcomes.

## Prerequisites

1. Install Langflow:

```bash
python -m pip install langflow --pre
```

2. 
Start a local Langflow instance with the Langflow CLI:

```bash
langflow run
```

Or start Langflow with Python:

```bash
python -m langflow run
```

Result:

```
│ Welcome to ⛓ Langflow                             │
│                                                   │
│ Access http://127.0.0.1:7860                      │
│ Collaborate, and contribute at our GitHub Repo 🚀 │
```

Alternatively, go to [HuggingFace Spaces](https://docs.langflow.org/getting-started/hugging-face-spaces) or [Lightning.ai Studio](https://lightning.ai/ogabrielluiz-8j6t8/studios/langflow) for a pre-built Langflow test environment.

3. Create an [OpenAI API key](https://platform.openai.com).

## Create the basic prompting project

1. From the Langflow dashboard, click **New Project**.
2. Select **Basic Prompting**.
3. The **Basic Prompting** flow is created.

This flow allows you to chat with the **OpenAI** component via a **Prompt** component.
Examine the **Prompt** component. The **Template** field instructs the LLM to `Answer the user as if you were a pirate.`
This should be interesting...

4. To create an environment variable for the **OpenAI** component, in the **OpenAI API Key** field, click the **Globe** button, and then click **Add New Variable**.
   1. In the **Variable Name** field, enter `openai_api_key`.
   2. In the **Value** field, paste your OpenAI API key (`sk-...`).
   3. Click **Save Variable**.

## Run the basic prompting flow

1. Click the **Run** button.
The **Interaction Panel** opens, where you can converse with your bot.
2. Type a message and press Enter.
The bot responds in a markedly piratical manner!

## Modify the prompt for a different result

1. To modify your prompt results, in the **Prompt** template, click the **Template** field.
The **Edit Prompt** window opens.
2. Change `Answer the user as if you were a pirate` to a different character, perhaps `Answer the user as if you were Harold Abelson.`
3. Run the basic prompting flow again.
The response will be markedly different.
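The idea behind the **Prompt** component can be approximated in plain Python: the template is an ordinary string with placeholders that get filled in before the text is sent to the LLM, so swapping `a pirate` for another persona changes the model's instructions without touching the rest of the flow. This is a minimal illustrative sketch, not Langflow's actual implementation; the `render_prompt` function and the `persona` variable name are assumptions for this example.

```python
def render_prompt(template: str, **variables: str) -> str:
    """Fill {placeholder} variables in a prompt template, as a
    Prompt-component-style template engine might."""
    return template.format(**variables)


# The same template produces different system instructions
# depending on the variable values supplied.
pirate = render_prompt(
    "Answer the user as if you were {persona}.",
    persona="a pirate",
)
abelson = render_prompt(
    "Answer the user as if you were {persona}.",
    persona="Harold Abelson",
)

print(pirate)   # Answer the user as if you were a pirate.
print(abelson)  # Answer the user as if you were Harold Abelson.
```

Because only the rendered string changes, the downstream **OpenAI** component needs no modification when you edit the template.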