Caution: Users must update to Langflow >= 1.3 to protect against CVE-2025-3248.
Langflow is a powerful tool for building and deploying AI-powered agents and workflows. It provides developers with both a visual authoring experience and built-in API and MCP servers that turn every workflow into a tool that can be integrated into applications built on any framework or stack. Langflow comes with batteries included and supports all major LLMs, vector databases and a growing library of AI tools.
✨ Highlight features
- Visual builder interface to get started quickly and iterate.
- Source code access lets you customize any component using Python.
- Interactive playground to immediately test and refine your flows with step-by-step control.
- Multi-agent orchestration with conversation management and retrieval.
- Deploy as an API or export as JSON for Python apps.
- Deploy as an MCP server and turn your flows into tools for MCP clients.
- Observability with LangSmith, Langfuse, and other integrations.
- Enterprise-ready security and scalability.
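As an illustration of the "deploy as an API" option above, here is a minimal sketch of calling a deployed flow over HTTP. The base URL, the `/api/v1/run/{flow_id}` endpoint, the flow ID, and the payload fields are assumptions based on a default local install, not values from this document; check the Langflow API docs for the exact contract.

```python
# Sketch: call a flow deployed as an API on a local Langflow server.
# The endpoint path, flow ID, and payload keys below are assumptions.
import json
import urllib.request


def build_run_request(base_url: str, flow_id: str, message: str) -> urllib.request.Request:
    """Build a POST request for a Langflow run endpoint (illustrative)."""
    payload = {
        "input_value": message,  # the chat input sent to the flow
        "output_type": "chat",
        "input_type": "chat",
    }
    return urllib.request.Request(
        url=f"{base_url}/api/v1/run/{flow_id}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Building the request does not hit the network; sending it would be:
# urllib.request.urlopen(req)  -- requires a running server.
req = build_run_request("http://127.0.0.1:7860", "my-flow-id", "Hello!")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` returns the flow's JSON response when a server is running at that address.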
⚡️ Quickstart
Langflow requires Python 3.10 to 3.13 and uv.
- To install Langflow, run `uv pip install langflow -U`.
- To start Langflow, run `uv run langflow run`.
- Open Langflow at the default URL, `http://127.0.0.1:7860`.
For more information about installing Langflow, including Docker and Desktop options, see Install Langflow.
📦 Deployment
Langflow is completely open source, and you can deploy it on all major cloud platforms. To learn how to use Docker to deploy Langflow, see the Docker deployment guide.
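A minimal sketch of the Docker route, assuming the public `langflowai/langflow` image and the default UI port; treat the image name and tag as assumptions and check the Docker deployment guide for current values:

```shell
# Run Langflow in a container, exposing the default UI port.
# Image name and tag are assumptions; see the Docker deployment guide.
docker run -it -p 7860:7860 langflowai/langflow:latest
```

Once the container is up, the UI is reachable at `http://127.0.0.1:7860`, the same address as a local install.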
⭐ Stay up-to-date
Star Langflow on GitHub to be instantly notified of new releases.
👋 Contribute
We welcome contributions from developers of all levels. If you'd like to contribute, please check our contributing guidelines and help make Langflow more accessible.