> [!CAUTION]
> Users must update to Langflow >= 1.3 to protect against CVE-2025-3248.
Langflow is a powerful tool for building and deploying AI-powered agents and workflows. It provides developers with both a visual authoring experience and built-in API and MCP servers that turn every workflow into a tool that can be integrated into applications built on any framework or stack. Langflow comes with batteries included and supports all major LLMs, vector databases, and a growing library of AI tools.
✨ Highlight features
- Visual builder interface to quickly get started and iterate.
- Source code access lets you customize any component using Python.
- Interactive playground to immediately test and refine your flows with step-by-step control.
- Multi-agent orchestration with conversation management and retrieval.
- Deploy as an API or export as JSON for Python apps.
- Deploy as an MCP server and turn your flows into tools for MCP clients.
- Observability with LangSmith, LangFuse and other integrations.
- Enterprise-ready security and scalability.
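As an illustration of the "Deploy as an API" path above, a client can call a flow over HTTP. The sketch below builds (but does not send) such a request; the `/api/v1/run/<flow_id>` path and the payload fields follow Langflow's documented pattern, but treat them as assumptions to verify against your installed version, and replace the hypothetical `FLOW_ID` with your own flow's ID:

```python
import json
import urllib.request

LANGFLOW_URL = "http://127.0.0.1:7860"  # default local URL from the quickstart
FLOW_ID = "my-flow-id"  # hypothetical placeholder: use your flow's actual ID


def build_run_request(message: str) -> urllib.request.Request:
    """Build (without sending) a POST request against the assumed run endpoint."""
    payload = {
        "input_value": message,   # the text passed into the flow
        "input_type": "chat",
        "output_type": "chat",
    }
    return urllib.request.Request(
        url=f"{LANGFLOW_URL}/api/v1/run/{FLOW_ID}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_run_request("Hello, Langflow!")
```

Sending the request (for example with `urllib.request.urlopen(req)`) requires a running Langflow instance and, depending on your configuration, an API key header.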
⚡️ Quickstart
Langflow requires Python 3.10 to 3.13 and uv.
- To install Langflow, run:

  ```shell
  uv pip install langflow -U
  ```

- To start Langflow, run:

  ```shell
  uv run langflow run
  ```

- Open Langflow at the default URL, http://127.0.0.1:7860.
For more information about installing Langflow, including Docker and Desktop options, see Install Langflow.
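After starting Langflow, you can verify that the server at the default URL is responding before opening the UI or calling the API. This is a minimal sketch; the `/health` endpoint name is an assumption that may differ between Langflow versions:

```python
import urllib.error
import urllib.request


def langflow_is_up(base_url: str = "http://127.0.0.1:7860", timeout: float = 2.0) -> bool:
    """Return True if a Langflow instance answers at base_url.

    The /health path is assumed here; adjust it if your version exposes
    a different status endpoint.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: treat as "not up".
        return False
```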
📦 Deployment
Langflow is completely open source, and you can deploy it to all major cloud providers. To learn how to use Docker to deploy Langflow, see the Docker deployment guide.
⭐ Stay up-to-date
Star Langflow on GitHub to be instantly notified of new releases.
👋 Contribute
We welcome contributions from developers of all levels. If you'd like to contribute, please check our contributing guidelines and help make Langflow more accessible.