> [!CAUTION]
> Users must update to Langflow >= 1.3 to protect against CVE-2025-3248.
Langflow is a powerful tool for building and deploying AI-powered agents and workflows. It provides developers with both a visual authoring experience and built-in API and MCP servers that turn every workflow into a tool that can be integrated into applications built on any framework or stack. Langflow comes with batteries included and supports all major LLMs, vector databases and a growing library of AI tools.
## ✨ Highlight features
- Visual builder interface to quickly get started and iterate.
- Source code access lets you customize any component using Python.
- Interactive playground to immediately test and refine your flows with step-by-step control.
- Multi-agent orchestration with conversation management and retrieval.
- Deploy as an API or export as JSON for Python apps.
- Deploy as an MCP server and turn your flows into tools for MCP clients.
- Observability with LangSmith, LangFuse and other integrations.
- Enterprise-ready security and scalability.
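
As a sketch of the API-deployment feature above: once a flow is deployed, it can be triggered over HTTP. The endpoint path (`/api/v1/run/<flow_id>`) and the payload fields below follow the pattern shown in Langflow's API docs, but the flow ID is hypothetical — check the API pane of your own flow for the exact request.

```python
import json

def build_run_request(base_url: str, flow_id: str, message: str):
    """Build the URL and JSON body for triggering a deployed flow.

    Assumptions: the "/api/v1/run/<flow_id>" path and the
    "input_value"/"output_type"/"input_type" fields mirror the request
    Langflow's API pane generates; adjust to what your instance shows.
    """
    url = f"{base_url}/api/v1/run/{flow_id}"
    payload = {
        "input_value": message,  # text handed to the flow's input component
        "output_type": "chat",
        "input_type": "chat",
    }
    return url, json.dumps(payload)

url, body = build_run_request(
    "http://127.0.0.1:7860", "my-flow-id", "Hello, Langflow!"
)
# A real call would then be sent with any HTTP client, e.g.:
#   requests.post(url, data=body,
#                 headers={"Content-Type": "application/json"})
print(url)
```

Keeping the request construction separate from the HTTP call makes the snippet easy to adapt to whichever client (requests, httpx, fetch) your application already uses.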
## ⚡️ Quickstart
Langflow requires Python 3.10 to 3.13 and uv.
- To install Langflow, run:

  ```shell
  uv pip install langflow -U
  ```

- To run Langflow, run:

  ```shell
  uv run langflow run
  ```

- Go to the default Langflow URL at `http://127.0.0.1:7860`.
For more information about installing Langflow, including Docker and Desktop options, see Install Langflow.
## 📦 Deployment
Langflow is completely open source and you can deploy it to all major deployment clouds. To learn how to use Docker to deploy Langflow, see the Docker deployment guide.
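
As a minimal sketch of a Docker deployment, a compose file can start from the `langflowai/langflow` image on Docker Hub. The port matches Langflow's default; the volume target path is an assumption for persisting flows and settings — treat this as a starting point, not a production config, and see the Docker deployment guide for the supported setup.

```yaml
# Minimal sketch; the image name is from Docker Hub, and the volume
# mount path is an assumption about where Langflow stores its data.
services:
  langflow:
    image: langflowai/langflow:latest
    ports:
      - "7860:7860"                    # Langflow's default port
    volumes:
      - langflow-data:/app/langflow    # persist flows across restarts
volumes:
  langflow-data:
```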
## ⭐ Stay up-to-date
Star Langflow on GitHub to be instantly notified of new releases.
## 👋 Contribute
We welcome contributions from developers of all levels. If you'd like to contribute, please check our contributing guidelines and help make Langflow more accessible.