docs: fix localhost address for NIM docs (#7391)

Fix localhost address for NIM docs
This commit is contained in:
Jordan Frazier 2025-04-01 11:37:58 -07:00 committed by GitHub
commit 4162d11710

@@ -24,7 +24,7 @@ To connect the NIM you've deployed with Langflow, add the **NVIDIA** model compo
1. Create a [basic prompting flow](/get-started-quickstart).
2. Replace the **OpenAI** model component with the **NVIDIA** component.
-3. In the **NVIDIA** component's **Base URL** field, add the URL where your NIM is accessible. If you followed your model's [deployment instructions](https://build.nvidia.com/nv-mistralai/mistral-nemo-12b-instruct/deploy?environment=wsl2.md), the value is `http://0.0.0.0:8000/v1`.
+3. In the **NVIDIA** component's **Base URL** field, add the URL where your NIM is accessible. If you followed your model's [deployment instructions](https://build.nvidia.com/nv-mistralai/mistral-nemo-12b-instruct/deploy?environment=wsl2.md), the value is `http://localhost:8000/v1`.
4. In the **NVIDIA** component's **NVIDIA API Key** field, add your NVIDIA API Key.
5. Select your model from the **Model Name** dropdown.
6. Open the **Playground** and chat with your **NIM** model.
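Before filling in the **Base URL** field, it can help to confirm the NIM is actually reachable. The sketch below is a minimal check, assuming a locally deployed NIM listening on `localhost:8000` that exposes the standard OpenAI-compatible `/v1/models` endpoint; the function name `list_nim_models` is illustrative, not part of Langflow or NIM.

```python
import json
import urllib.request

# Base URL from the deployment instructions above (assumed local NIM).
BASE_URL = "http://localhost:8000/v1"

def list_nim_models(base_url: str = BASE_URL) -> list[str]:
    """Return the model IDs the NIM reports at its OpenAI-compatible
    /v1/models endpoint. Raises if the NIM is not reachable."""
    with urllib.request.urlopen(f"{base_url}/models", timeout=10) as resp:
        payload = json.load(resp)
    return [model["id"] for model in payload.get("data", [])]
```

If this returns your model's ID, the same URL should work in the **NVIDIA** component's **Base URL** field; the model ID it returns is what you would pick in the **Model Name** dropdown.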